Top 10 audit reports to trust
In licensed iGaming, "trust but verify" means specific documents. Below are ten reports that give an objective picture of how honest a provider's or operator's content and processes are. Save this as a due diligence checklist.
1) RNG Certification/Evaluation Report
What it confirms: unpredictability of the random number generator, absence of correlations, correct seeding and seed rotation.
What to check in the report: methods (NIST/Diehard/TestU01), sample sizes, p-values, RNG version, and test date.
Red flags: vague methods, small samples, an expired test date, no data on seeding.
Frequency: on any RNG/engine change or significant version update.
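To illustrate the kind of statistical test such a report should cite, here is a minimal sketch of the NIST SP 800-22 monobit (frequency) test. The sample sizes and data are invented for illustration; a real certification runs the full battery, since the monobit test alone cannot detect correlations.

```python
import math

def monobit_p_value(bits):
    """NIST SP 800-22 monobit test: p-value for the hypothesis that
    ones and zeros are equally likely in the bit stream."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)      # +1 per one, -1 per zero
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

# A heavily biased stream (60% ones) fails decisively: p is ~0,
# far below the usual 0.01 significance threshold.
biased = [1] * 60_000 + [0] * 40_000
p_biased = monobit_p_value(biased)

# A perfectly alternating stream passes monobit (p = 1.0) despite being
# totally predictable -- which is why reports must list multiple tests.
alternating = [i % 2 for i in range(100_000)]
p_alternating = monobit_p_value(alternating)
```

This is exactly why "vague methods" is a red flag: a single passing p-value proves very little on its own.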
2) Report on mathematics and RTP (Game Math/RTP Verification)
What it confirms: that the actual return matches the declared RTP over large simulations, plus the volatility profile and hit/bonus frequencies.
What to check in the report: confidence intervals, win distributions, win-cap and per-round checks, a list of all RTP variants.
Red flags: marketing quotes one RTP while the report shows others; intervals that are too wide; no event frequencies.
Frequency: for every game version and every RTP variant.
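A hedged sketch of how a lab arrives at an RTP confidence interval: Monte Carlo simulation with a normal-approximation interval. The toy payout model below (a 9.6x win with probability 0.1, i.e. 96% theoretical RTP) is invented for illustration and far simpler than real game math.

```python
import math
import random

def estimate_rtp(spin, n_rounds, seed=0):
    """Monte Carlo RTP estimate with a 99% normal-approximation
    confidence interval. `spin(rng)` returns payout per 1.0 staked."""
    rng = random.Random(seed)
    payouts = [spin(rng) for _ in range(n_rounds)]
    mean = sum(payouts) / n_rounds
    var = sum((x - mean) ** 2 for x in payouts) / (n_rounds - 1)
    half = 2.576 * math.sqrt(var / n_rounds)   # z-score for 99%
    return mean, (mean - half, mean + half)

def toy_spin(rng):
    """Illustrative game: 96% RTP, pays 9.6x with probability 0.1."""
    return 9.6 if rng.random() < 0.1 else 0.0

rtp, (lo, hi) = estimate_rtp(toy_spin, 200_000)
```

Note how the interval width shrinks with sample size: this is why small simulations and "too wide intervals" are red flags.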
3) Functional/Game Rules Compliance
What it confirms: correctness of the rules and paytable, bonus behavior, behavior in edge-case scenarios (disconnects, rollbacks, callbacks).
What to check in the report: the list of scenarios, results, asset versions (paytable, help), build dates.
Red flags: no edge cases, no screenshots or replay logs.
Frequency: for any change to mechanics, payouts, or the help UI.
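One way to audit a functional report for the "no edge cases" red flag is a simple coverage diff against the scenarios you expect to see. The scenario names below are illustrative, not a standard list.

```python
# Illustrative set of edge cases a functional report should cover.
REQUIRED_EDGE_CASES = {
    "disconnect_mid_spin",
    "round_rollback",
    "callback_replay",
    "max_win_cap",
    "bonus_interrupt",
}

def scenario_gaps(tested):
    """Return required edge cases absent from the report's scenario list."""
    return sorted(REQUIRED_EDGE_CASES - set(tested))

# A report that only covers two scenarios leaves three visible gaps.
gaps = scenario_gaps({"disconnect_mid_spin", "max_win_cap"})
```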
4) Software Verification/Build Integrity
What it confirms: that the build running in production is the certified one.
What to check in the report: file hashes/signatures, the artifact list, binding to the certificate and versions.
Red flags: hash mismatches at the operator, missing signatures, "gray" files outside the manifest.
Frequency: on every release and during field inspections.
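The hash comparison itself is mechanical. A minimal sketch, assuming the certified manifest is a mapping of file name to SHA-256 hex digest (the file names here are invented):

```python
import hashlib
import tempfile
from pathlib import Path

def sha256_of(path):
    """SHA-256 of a file, streamed in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_build(manifest, root):
    """Compare deployed artifacts against the certified manifest.
    Returns (mismatched_or_missing, unlisted_gray_files)."""
    root = Path(root)
    bad = [name for name, expected in manifest.items()
           if not (root / name).is_file() or sha256_of(root / name) != expected]
    gray = sorted(p.name for p in root.iterdir()
                  if p.is_file() and p.name not in manifest)
    return bad, gray

# Demo on a throwaway directory: one certified file, one "gray" extra.
root = tempfile.mkdtemp()
Path(root, "game.bin").write_bytes(b"certified build")
Path(root, "extra.dll").write_bytes(b"came from nowhere")
manifest = {"game.bin": hashlib.sha256(b"certified build").hexdigest()}
bad, gray = verify_build(manifest, root)
```

Both outputs matter: `bad` catches tampered builds, `gray` catches the unlisted files the red flag warns about.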
5) RGS/Process Audit Report
What it confirms: isolation of the game core on the provider's servers, access control, change management, logging of admin actions.
What to check in the report: architecture diagrams, IAM policies, CI/CD, environment separation, dual control on critical operations.
Red flags: manual edits in production, a shared admin account, no logs.
Frequency: annually, or on significant architectural changes.
6) Jurisdictional/Regulatory Compliance
What it confirms: compliance with local rules: RTP/warning visibility, reporting formats, bet limits, responsible gambling (RG) requirements.
What to check in the report: the list of jurisdictions, links to the requirements, UI screenshots, tests of regulatory report uploads.
Red flags: a single generic report "for all markets" with no per-requirement specifics.
Frequency: on entering a new market and whenever the rules change.
7) Post-Market Statistical Monitoring Report
What it confirms: the absence of anomalies after release, measured on large samples.
What to check in the report: RTP drift against confidence intervals, bonus/hit frequencies, network jackpots, the alerting methodology.
Red flags: persistent deviations with no investigation, "manual" data sampling.
Frequency: monthly/quarterly.
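A sketch of the alerting logic behind such monitoring, assuming the report supplies the theoretical RTP and the per-unit payout standard deviation (numbers below are illustrative). The observed RTP is flagged when it falls outside the band expected by chance for the given sample size.

```python
import math

def rtp_drift_alert(theoretical_rtp, payout_sd, observed_rtp, n_rounds, z=3.29):
    """Flag post-market RTP drift: alert when the observed RTP sits
    outside the ~99.9% band (z = 3.29) expected after n_rounds."""
    standard_error = payout_sd / math.sqrt(n_rounds)
    return abs(observed_rtp - theoretical_rtp) > z * standard_error

# 96% game, payout sd of 3.0 per unit staked, one million rounds:
# the tolerance band is roughly +/- 1 percentage point.
normal_drift = rtp_drift_alert(0.96, 3.0, observed_rtp=0.952, n_rounds=1_000_000)
anomaly = rtp_drift_alert(0.96, 3.0, observed_rtp=0.945, n_rounds=1_000_000)
```

The key property: the band narrows as rounds accumulate, so a deviation that is noise at 10,000 rounds becomes a reportable anomaly at a million.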
8) Field report (Proof-on-Production/Field Inspection)
What it confirms: that the version, hashes, and behavior of the game at a real operator match the certified reference.
What to check in the report: the "operator → version → hash" matrix, sample play with round IDs verified against RGS logs, help screens (RTP/version).
Red flags: "unique" builds for a single site, discrepancies between help screens and the certificate.
Frequency: spot checks on a schedule or during escalations.
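The "operator → version → hash" matrix boils down to a cross-check against certified releases. A minimal sketch with invented operator names and hashes:

```python
def field_check(certified, observed):
    """Cross-check field-inspection data against certified releases.
    `certified` maps version -> hash; `observed` maps operator ->
    (version, hash). Returns a dict of per-operator issues."""
    issues = {}
    for operator, (version, build_hash) in observed.items():
        if version not in certified:
            issues[operator] = "uncertified version " + version
        elif certified[version] != build_hash:
            issues[operator] = "hash mismatch for " + version
    return issues

certified = {"1.4.2": "aa11", "1.5.0": "bb22"}
observed = {
    "operator-a": ("1.5.0", "bb22"),     # matches the certified build
    "operator-b": ("1.5.0", "ff99"),     # "unique" build for one site
    "operator-c": ("0.9-beta", "cc33"),  # version never certified
}
issues = field_check(certified, observed)
```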
9) Security report (ISO/IEC 27001 / SOC 2 Type II summary)
What it confirms: the maturity of the security management system: IAM, key/secret management, logging, redundancy, risk management.
What to check in the report: scope, control domains, exceptions, remediation deadlines.
Red flags: "certification pending," no plan to close findings, a severely narrowed scope.
Frequency: annually (SOC 2 Type II covers a reporting period; ISO follows the surveillance cycle).
10) Penetration Test/VA Report
What it confirms: that the external perimeter, RGS/API, admin portals, and client applications have been tested for vulnerabilities.
What to check in the report: methodology (OWASP), severity of findings (CVSS), proof of exploitability, fix timelines, retest results.
Red flags: a scan run for the scan's sake, with no exploitation attempts; open critical vulnerabilities with no deadlines.
Frequency: at least annually and after major releases.
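The "open criticals without deadlines" red flag is easy to automate if the report ships findings as structured data. A sketch, assuming a simple record layout (the field names and finding IDs are illustrative):

```python
from datetime import date

def overdue_criticals(findings, today):
    """Pentest findings of CVSS >= 7.0 that are still open past their
    fix deadline, or that have no deadline at all -- both red flags."""
    return [f["id"] for f in findings
            if f["status"] == "open" and f["cvss"] >= 7.0
            and (f.get("deadline") is None or f["deadline"] < today)]

findings = [
    {"id": "PT-1", "cvss": 9.1, "status": "open",  "deadline": date(2025, 1, 31)},
    {"id": "PT-2", "cvss": 8.0, "status": "fixed", "deadline": date(2025, 1, 31)},
    {"id": "PT-3", "cvss": 7.5, "status": "open",  "deadline": None},
    {"id": "PT-4", "cvss": 4.3, "status": "open",  "deadline": None},
]
overdue = overdue_criticals(findings, today=date(2025, 3, 1))
```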
How to quickly vet any report (mini-checklist)
1. Identity: the game/engine version and test date match what is running in production.
2. Methods: standards, data volumes, and pass criteria are stated.
3. Traceability: hash lists, round IDs, and links to certificates are present.
4. Findings: listed, each with a remediation plan (owners, deadlines).
5. Relevance: the report has not expired; there is a history of retests.
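The five checks above can be sketched as a single validation pass over a report's metadata. The field names below are illustrative, not a standard schema:

```python
from datetime import date

def vet_report(report, prod_version, today):
    """Run the five-point mini-checklist over a report's metadata.
    Returns the names of the checks that failed."""
    failures = []
    if report.get("game_version") != prod_version:
        failures.append("identity")      # tested version != production
    if not report.get("methods"):
        failures.append("methods")       # no standards/criteria listed
    if not (report.get("hashes") and report.get("round_ids")):
        failures.append("traceability")  # hashes or round IDs missing
    if any(not f.get("remediation") for f in report.get("findings", [])):
        failures.append("findings")      # finding without a remediation plan
    if (today - report["issued"]).days > 365:
        failures.append("relevance")     # older than 12 months
    return failures

fresh = {
    "game_version": "1.5.0", "methods": ["TestU01"],
    "hashes": ["bb22"], "round_ids": ["r-1001"],
    "findings": [{"id": "F-1", "remediation": "patch by Q3"}],
    "issued": date(2025, 2, 1),
}
ok = vet_report(fresh, prod_version="1.5.0", today=date(2025, 6, 1))
stale = dict(fresh, issued=date(2023, 1, 1), game_version="1.4.0")
problems = vet_report(stale, prod_version="1.5.0", today=date(2025, 6, 1))
```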
Common mistakes when reading reports
Confusing a marketing badge with technical certification. You need the linkage to versions and hashes.
Looking only at RTP and ignoring processes. Without change management and IAM, any certificate loses its value.
Relying on old reports. Content and requirements change: check dates and retest records.
Bottom line: requesting documents from the provider/operator
If the provider and operator hold the full set of ten reports, current by date and version, with hash lists and remediation plans, you are dealing with a mature, transparent ecosystem. Anything less is a reason to dig deeper, ask questions, and limit traffic until evidence is provided.
