Security and Compliance
Dataddo serves a wide range of organizations, from startups to established industry players. We prioritize the protection of customer data and aim to be transparent about our data usage practices.
Dataddo has instituted a resilient set of security protocols and controls to ensure customer data remains protected. These protocols encompass
- Regular audits,
- Vulnerability assessments,
- Multi-factor authentication, and
- Encryption of confidential data.
Beyond this, we strictly adhere to the highest global standards and regulatory requirements, including GDPR and SOC 2 Type II. We continually refine our security practices to safeguard customer data, recognizing the importance of meeting our customers' most demanding security and privacy requirements.
Comprehensive Security Measures
Dataddo provides advanced enterprise-level security features, ensuring robust controls for your data integration tasks.
Controlled Network Access
Our infrastructure is fully isolated in a private network that is inaccessible from the outside environment. We employ various access control strategies, ensuring that only a select group of Dataddo reliability engineers can access the systems. These strategies include background checks, regular training, and continuous monitoring of access logs.
To further strengthen security, access to the systems requires multi-factor authentication (MFA) via a secure bastion host, ensuring your data's privacy and safety at all times. The bastion host uses strict access controls and logging mechanisms to ensure that all actions are tracked and auditable.
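For illustration, the sketch below shows how an engineer might reach an internal system only by tunneling through a bastion (jump) host. The hostnames, usernames, and key paths are placeholders, the MFA challenge is omitted, and this is a conceptual example of the pattern rather than Dataddo's actual access procedure.

```python
import os
import paramiko  # pip install paramiko

# Placeholder hosts and credentials -- not real Dataddo infrastructure.
BASTION_HOST = "bastion.example.com"
INTERNAL_HOST = "db-node.internal.example"
KEY_PATH = os.path.expanduser("~/.ssh/id_ed25519")

# Step 1: authenticate against the bastion host (MFA challenge omitted here).
bastion = paramiko.SSHClient()
bastion.set_missing_host_key_policy(paramiko.AutoAddPolicy())
bastion.connect(BASTION_HOST, username="engineer", key_filename=KEY_PATH)

# Step 2: open a tunnel from the bastion to the internal host, which has no
# public network exposure and is reachable only through the bastion.
channel = bastion.get_transport().open_channel(
    "direct-tcpip", dest_addr=(INTERNAL_HOST, 22), src_addr=("127.0.0.1", 0)
)

# Step 3: connect to the internal host over the tunneled channel; this session,
# like all bastion activity, would be logged for auditing.
internal = paramiko.SSHClient()
internal.set_missing_host_key_policy(paramiko.AutoAddPolicy())
internal.connect(INTERNAL_HOST, username="engineer", sock=channel, key_filename=KEY_PATH)
```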
Robust Encryption Practices
Encryption is pivotal in safeguarding data as it protects sensitive information from unauthorized access, theft, and interception. At Dataddo, we use a suite of encryption techniques:
Data "in Transit"
Data "in transit" refers to data that is currently moving between two systems over a network.
To protect this data from eavesdropping or interception, we use Transport Layer Security (TLS). TLS encrypts the data end to end between the sender and recipient, protecting it from unauthorized access during transmission.
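As a minimal client-side sketch (the endpoint URL is a placeholder), the snippet below shows how a connection to an HTTPS API can enforce certificate verification and a modern TLS version, which is the standard way data in transit is protected:

```python
import ssl
import urllib.request

# The default context verifies the server certificate against trusted CAs.
context = ssl.create_default_context()
# Refuse anything older than TLS 1.2.
context.minimum_version = ssl.TLSVersion.TLSv1_2

# Placeholder endpoint; the payload is encrypted on the wire by TLS.
with urllib.request.urlopen("https://api.example.com/v1/data", context=context) as response:
    payload = response.read()
```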
Data "at Rest"
Data "at rest" refers to data that is stored on a device or system.
To secure this data, we use Advanced Encryption Standard (AES-256) encryption, which is considered highly secure and is widely used to protect sensitive information. The encryption keys are managed by a third-party service such as
- Amazon Web Services (AWS) Key Management Service (KMS),
- Google Cloud Key Management Service, or
- Azure Key Vault.
The keys themselves are protected by a third-party Hardware Security Module (HSM)-backed key management service, which stores and manages the keys in a secure environment designed to protect against attacks and unauthorized access. This ensures that your data is protected even if the underlying systems are compromised.
The security of the encryption process is further enhanced by the separation of duties between the data owner and the key custodian. The data owner retains control over their encrypted data, while the key custodian (key management service) retains control over the encryption keys. This means that there is no single point of failure and that both the data and the encryption keys are kept secure.
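The sketch below illustrates the general pattern of AES-256 encryption with a separately managed data key. It uses the Python cryptography library and generates the key locally for brevity; in a deployment like the one described above, the key would be issued and wrapped by a KMS/HSM service and never stored alongside the data. This is a conceptual example, not Dataddo's implementation.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

# In production the 256-bit data key comes from a KMS/HSM; it is generated
# locally here purely for illustration.
data_key = AESGCM.generate_key(bit_length=256)

def encrypt_record(plaintext: bytes, key: bytes) -> bytes:
    """AES-256-GCM encryption; the 12-byte nonce is prepended to the ciphertext."""
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

def decrypt_record(blob: bytes, key: bytes) -> bytes:
    """Split off the nonce and authenticate-then-decrypt the ciphertext."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

token = encrypt_record(b"customer record", data_key)
assert decrypt_record(token, data_key) == b"customer record"
```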
Credential Protection
Credentials, such as login information for third-party services, are often among the most sensitive pieces of data in an organization. To protect credentials from unauthorized access, we encrypt them in the same way as data "at rest".
We also apply further network isolation to the systems where the credentials are stored, which involves:
- Separating the credential storage systems from other networked systems,
- Limiting access to the systems, and
- Applying additional security measures such as firewalls and intrusion detection systems.
Granular System Auditing
Granular system auditing enables administrators to track and analyze all system activities.
Reliability Assurance
Dataddo offers an industry-leading availability guarantee for all clusters used for production deployments.
Compliance Excellence
Dataddo regularly undergoes independent external verification for platform security, privacy, and compliance. Our growing commitment to standards and compliance, including GDPR and SOC 2 Type II, supports your regulatory goals.
Prioritizing Privacy & Access Controls
Dataddo's primary focus is to protect your data's privacy. We've instituted logical controls and rigorous management processes to ensure only authorized personnel access your data.
Controlled Access
Our role-based access controls (RBAC) are meticulously designed, restricting system access to a small group of Dataddo reliability engineers. Access requires multi-factor authentication (MFA) through a secure bastion host, and every access is logged for auditing purposes.
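Conceptually, an RBAC check combines a role-to-permission mapping with an MFA requirement before any action is allowed. The roles, permissions, and field names below are hypothetical and serve only to illustrate the idea:

```python
from dataclasses import dataclass

# Hypothetical roles and permissions, for illustration only.
ROLE_PERMISSIONS = {
    "reliability_engineer": {"read_logs", "access_production"},
    "support": {"read_logs"},
}

@dataclass
class User:
    name: str
    role: str
    mfa_verified: bool

def can_access(user: User, permission: str) -> bool:
    """Allow an action only if MFA succeeded and the user's role carries the permission."""
    return user.mfa_verified and permission in ROLE_PERMISSIONS.get(user.role, set())

assert can_access(User("alice", "reliability_engineer", mfa_verified=True), "access_production")
assert not can_access(User("bob", "support", mfa_verified=True), "access_production")
assert not can_access(User("eve", "reliability_engineer", mfa_verified=False), "read_logs")
```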
Access Management
Senior management grants access to client data only when it is needed to address service reliability concerns. Access logs, permissions, and entitlements are regularly audited to ensure that access is granted only when necessary.
Comprehensive Auditing & Logging
Dataddo provides users with two types of logs to help them track activities within their accounts:
Account-Level Logs
At the account level, Dataddo logs all user management and authentication-related activities, such as
- User login/logout events,
- Password reset activities, and
- Permission changes.
Access these logs by navigating to the Notifications page and switching to the Activity Log tab.
Action-Level Logs
At the action level, Dataddo records all data integration activities performed by users, including
- Data extraction operations,
- Data transformations, and
- Data loading operations.
Users can view detailed information about each integration job, such as the source and destination systems, the time and duration of the job, and any errors or warnings encountered during the process. These logs help users troubleshoot any issues that arise during the data integration process and monitor the performance of their integrations over time.
Users can access action-level logs by clicking on the three dots button next to a data source or data flow and selecting Show Logs.
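Once copied or exported out of the UI, such log entries can also be analyzed programmatically. The snippet below filters hypothetical action-level entries for failed or warning-laden jobs; the field names are assumptions made for illustration and do not reflect Dataddo's actual log schema.

```python
# Hypothetical action-level log entries; field names are illustrative only.
action_logs = [
    {"job_id": "flow-101", "source": "Facebook Ads", "destination": "BigQuery",
     "duration_s": 42, "status": "success", "warnings": []},
    {"job_id": "flow-102", "source": "HubSpot", "destination": "Snowflake",
     "duration_s": 310, "status": "error", "warnings": ["rate limit reached"]},
]

# Surface jobs that failed or produced warnings, for troubleshooting.
problem_jobs = [
    entry for entry in action_logs
    if entry["status"] != "success" or entry["warnings"]
]

for job in problem_jobs:
    print(f'{job["job_id"]}: {job["source"]} -> {job["destination"]} ({job["status"]})')
```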