Security and Compliance
- 5 Minutes to read
Thousands of organizations, from startups to industry leaders, entrust Dataddo with their data. As such, Dataddo is committed to protecting customer data and upholding transparency with regard to data usage.
We have implemented a robust set of security processes and controls to ensure that customer data remains safe and secure at all times. Our security measures include regular audits and vulnerability assessments, as well as multi-factor authentication and encryption of sensitive data.
In addition, we adhere to the highest levels of standards conformance and regulatory compliance, including GDPR compliance and SOC 2 Type II certification. We recognize the importance of meeting the most demanding security and privacy requirements of our customers, and will continue to improve our security practices to ensure that customer data remains protected.
Dataddo offers enterprise-level security features to help you set up rigorous controls for all your data integration workloads.
Network Isolation and Access
Our underlying systems are fully isolated in a private network that is inaccessible from the outside environment. We use a range of access control measures to ensure that only a small group of Dataddo reliability engineers have access to the systems, including background checks, regular training, and continuous monitoring of access logs.
Access to the systems requires multi-factor authentication (MFA) through a secure bastion host, which acts as a gateway to the private network. The bastion host uses strict access controls and logging mechanisms to ensure that all actions are tracked and auditable. By implementing these measures, we help to ensure the security and confidentiality of your data at all times.
Encryption
Encryption is a crucial component of data security, as it helps to protect sensitive information from unauthorized access, theft, and interception. At Dataddo, we employ a range of encryption measures to ensure that your data is always kept safe and secure.
Data "in Transit"
Data "in transit" refers to data that is currently moving between two systems over a network. To protect this data from potential eavesdropping or interception, we use Transport Layer Security (TLS) encryption. TLS encryption provides end-to-end encryption between the sender and recipient, meaning that the data is protected from any unauthorized access during transmission.
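As an illustration of what TLS protection means on the client side (this is a generic sketch, not Dataddo's internal code), Python's standard `ssl` module can build a client context that verifies server certificates, checks hostnames, and refuses legacy protocol versions:

```python
import ssl

# Build a client-side TLS context with secure defaults:
# certificate verification and hostname checking are both enabled.
context = ssl.create_default_context()

# Refuse legacy protocol versions; require TLS 1.2 or newer.
context.minimum_version = ssl.TLSVersion.TLSv1_2

print(context.verify_mode == ssl.CERT_REQUIRED)  # certificates are verified
print(context.check_hostname)                    # hostnames are checked
```

Any socket wrapped with this context will only complete a handshake with a server presenting a valid certificate for the expected hostname, which is what prevents eavesdropping and man-in-the-middle interception in transit.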
Data "at Rest"
Data "at rest" refers to data that is stored on a device or system. To secure this data, we use Advanced Encryption Standard (AES) 256 encryption. This encryption standard is considered to be highly secure and is widely used for protecting sensitive information. The encryption keys are managed by a third-party service such as Amazon Web Services (AWS) Key Management Service (KMS), Google Cloud Key Management Service, or Azure Key Vault. The keys themselves are protected by a third-party Hardware Security Module (HSM)-backed key management service, which stores and manages the keys in a secure environment designed to protect against attacks and unauthorized access. This ensures that your data is protected even if the underlying systems are compromised.
The security of the encryption process is further enhanced by the separation of duties between the data owner and the key custodian. The data owner retains control over their encrypted data, while the key custodian (key management service) retains control over the encryption keys. This means that there is no single point of failure and that both the data and the encryption keys are kept secure.
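The pattern described above is commonly called envelope encryption: a per-object data-encryption key (DEK) protects the data, and a key-encryption key (KEK) held by the key custodian protects the DEK. The sketch below illustrates the pattern with AES-256-GCM via the `cryptography` package; it is illustrative only, and in production the KEK would never leave the HSM-backed KMS:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Key-encryption key (KEK): in production this stays inside the
# HSM-backed KMS; it is generated locally here only for illustration.
kek = AESGCM.generate_key(bit_length=256)

# Data-encryption key (DEK): a fresh AES-256 key for this object.
dek = AESGCM.generate_key(bit_length=256)

# Data owner's side: encrypt the payload with the DEK.
nonce = os.urandom(12)
ciphertext = AESGCM(dek).encrypt(nonce, b"customer record", None)

# Key custodian's side: wrap the DEK with the KEK; only the
# wrapped DEK is stored alongside the ciphertext.
wrap_nonce = os.urandom(12)
wrapped_dek = AESGCM(kek).encrypt(wrap_nonce, dek, None)

# To read the data: unwrap the DEK, then decrypt the payload.
unwrapped = AESGCM(kek).decrypt(wrap_nonce, wrapped_dek, None)
plaintext = AESGCM(unwrapped).decrypt(nonce, ciphertext, None)
print(plaintext)  # b'customer record'
```

Because the stored artifacts are the ciphertext and the wrapped DEK, compromising the storage system alone yields nothing readable; the attacker would also need the KEK, which only the key custodian controls.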
Credentials, such as login information for third-party services, are often some of the most sensitive pieces of data in an organization. To protect these credentials from unauthorized access, we encrypt them in the same way as data "at rest".
We also apply further network isolation to the systems where the credentials are stored, which involves:
- separating the credential storage systems from other networked systems,
- limiting access to those systems,
- and applying additional security measures such as firewalls and intrusion detection systems.
Granular System Auditing
Granular system auditing in Dataddo allows administrators to answer detailed questions about system activity by tracking all activity in their accounts.
High Availability
Dataddo offers an industry-leading availability guarantee for all clusters used for production deployments.
Compliance
Dataddo regularly undergoes independent verification of platform security, privacy, and compliance controls. Our strong and growing focus on standards conformance and compliance will help you meet your regulatory and policy objectives.
Privacy and Data Access Controls
At Dataddo, protecting the privacy of your data is our top priority. We ensure that only authorized personnel can access your data, and we have implemented logical controls and management processes to tightly restrict and monitor access.
To ensure the security of your data, we use role-based access controls (RBAC) that limit system access to a small group of Dataddo reliability engineers. To further secure access, we require multi-factor authentication (MFA) through a secure bastion host, and we log all actions for auditing purposes.
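MFA schemes like the one mentioned above commonly rely on time-based one-time passwords (TOTP, RFC 6238). The following minimal sketch shows how such a code is derived from a shared secret and the current time step; it is a generic illustration of the standard, not Dataddo's actual implementation:

```python
import hmac
import hashlib
import struct
import time

def totp(secret: bytes, for_time=None, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the current time-step counter."""
    counter = int((for_time if for_time is not None else time.time()) // step)
    msg = struct.pack(">Q", counter)  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes at a digest-derived offset.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890" at time 59
print(totp(b"12345678901234567890", for_time=59))  # → 287082
```

Because the code changes every 30 seconds and is derived from a secret that never crosses the network, a stolen password alone is not enough to pass the bastion host's authentication.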
Access to client data is granted only by senior management during service reliability issues. We regularly audit access logs, permissions, and entitlements to ensure that access is granted only when necessary.
Auditing and Logging
Dataddo provides users with two types of logs to help them track activities within their accounts: account-level logs and action-level logs.
At the account level, Dataddo logs record all user management and authentication-related activities, such as user login/logout events, password reset activities, and permission changes. Users can access these logs by navigating to Notifications → Activity logs.
Account-level logs allow users to manage access and security within their account effectively.
At the action level, Dataddo records all data integration activities performed by users, including data extraction, transformation, and loading operations. Users can view detailed information about each integration job, such as the source and destination systems, the time and duration of the job, and any errors or warnings encountered during the process. These logs help users troubleshoot any issues that arise during the data integration process and monitor the performance of their integrations over time.
Users can access action-level logs by clicking the three dots next to a source or flow and selecting Show Logs.