Confluent introduces data security and access controls

With more granular data governance and enterprise-grade security, Confluent Cloud reduces the risk of Kafka breaches and downtime, enabling pervasive event streaming that is easier and more secure.

Confluent has introduced new capabilities in Confluent Cloud that protect event data from threats and vulnerabilities across the entire Apache Kafka® ecosystem. This supports the latest Project Metamorphosis theme – Secure – and ensures the next generation of event streaming applications can be put at the heart of every business while avoiding costly downtime and risk.

 

The full value of event streaming is unlocked when it’s established as the central platform for event data across an entire business. This widespread use of Kafka makes it even more critical to ensure that the information it processes is properly secured and meets regulatory compliance. But securing Kafka at scale is complex, and doing it wrong can cost thousands of dollars per minute in downtime and increase the risk of data breaches. Through Project Metamorphosis, Confluent is adding new capabilities like role-based access control (RBAC), audit logs, customer control of encryption keys, and additional private networking connectivity options to Confluent Cloud to make it even easier to secure sensitive events and harness the full power of cloud-native Kafka.

 

“Event streaming’s real power lies in connecting all data to create a central nervous system for any real-time business,” said Jay Kreps, co-founder and CEO, Confluent. “However, the more data a platform can access, the more protection is needed to keep that data safe. With this release, Confluent Cloud is the best approach to safeguarding Apache Kafka, removing yet another barrier to pervasive event streaming.” 

 

Confluent makes event streaming secure by default

 

More is required of Kafka as it is implemented in thousands of mission-critical applications. It is used by more than 80% of Fortune 100 companies and across highly regulated industries, like financial services, manufacturing, and healthcare, to harness the power of real-time events for valuable business outcomes and improved customer experiences. Confluent is building the next-generation event streaming platform that any company, in any industry, can put at the heart of its business. With these new capabilities and built-in security across the entire Kafka ecosystem, Confluent cuts down the time and effort typically needed to keep Kafka secure and compliant. Now any organisation can swiftly make event streaming central to its business.

 

Protect Data Across the Entire Kafka Ecosystem with Granular Access Controls and Real-Time Monitoring

      Role-based access control (RBAC) – In order to democratise data and event streams across a company, access controls are needed for both users and applications, so it’s clear which individuals can access specific data. RBAC in Confluent Cloud provides scalable, centralised role and privilege definitions that apply not just to Kafka, but also to Schema Registry, Kafka Connect, and ksqlDB.
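
To ground this, here is a minimal sketch of how an application might authenticate to a Confluent Cloud cluster using the confluent-kafka Python client; the bootstrap address, API key, and topic are placeholders. The API key maps to a principal (a user or service account), and the role bindings granted to that principal determine whether this produce request is authorised.

    from confluent_kafka import Producer

    # Placeholder connection details for a Confluent Cloud cluster.
    # The API key/secret identifies a principal; RBAC role bindings on that
    # principal (e.g. write access to the topic) decide what it may do.
    config = {
        "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "<API_KEY>",
        "sasl.password": "<API_SECRET>",
    }

    producer = Producer(config)

    def on_delivery(err, msg):
        # An authorisation failure here typically means the principal has no
        # role binding that allows writes to the target topic.
        if err is not None:
            print(f"Delivery failed: {err}")
        else:
            print(f"Delivered to {msg.topic()} [{msg.partition()}]")

    producer.produce("orders", key="order-1", value='{"total": 42.0}', on_delivery=on_delivery)
    producer.flush()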

 

      Audit logs – With more individuals in a company using Kafka, it gets harder to keep track of who is attempting to access event data and what actions are being taken on Kafka topics. Audit logs provide real-time insight into platform authorisation and other security events, so action can quickly be taken to deal with any suspicious activity.
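
Because audit records are themselves delivered as Kafka events, they can be consumed like any other topic. The sketch below assumes the documented confluent-audit-log-events topic and a simplified payload shape; the field names and credentials are illustrative, not a definitive schema.

    import json
    from confluent_kafka import Consumer

    # Placeholder credentials for the cluster that receives audit log events.
    config = {
        "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "<AUDIT_API_KEY>",
        "sasl.password": "<AUDIT_API_SECRET>",
        "group.id": "audit-log-monitor",
        "auto.offset.reset": "earliest",
    }

    consumer = Consumer(config)
    consumer.subscribe(["confluent-audit-log-events"])

    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None or msg.error():
                continue
            event = json.loads(msg.value())
            # Assumed payload shape: flag any authorisation attempt that was denied.
            auth = event.get("data", {}).get("authorizationInfo", {})
            if auth.get("granted") is False:
                print("Denied:", auth.get("operation"), auth.get("resourceName"))
    finally:
        consumer.close()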

 

Role-based access control and audit logs are available as a preview in Confluent Cloud and generally available in Confluent Platform.

 

Minimise Risk of Security Breaches with Built-In Provisioning Guardrails That Don’t Slow App Development

      Private networking connectivity – Applications that need access to Kafka are likely spread across many networks. Connecting them all to Confluent Cloud with traditional VPC peering can be challenging and requires considerable coordination between teams and between the peering parties. New private networking connectivity reduces the operational burden of setting up and maintaining platform security and of securely transferring data between the managed service and clients. It is available now for Confluent Cloud Dedicated clusters via AWS PrivateLink, in addition to existing support for Amazon Transit Gateway and VPC/VNet peering on all clouds.
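
As a rough illustration of the client side of a PrivateLink connection, the sketch below creates an AWS interface VPC endpoint with boto3. The service name would come from the Dedicated cluster's networking details in Confluent Cloud; every identifier shown is a placeholder, and additional DNS configuration is typically needed before clients can resolve the bootstrap address.

    import boto3

    # Placeholder identifiers: the PrivateLink service name is taken from the
    # Dedicated cluster's networking settings; the VPC, subnets and security
    # group belong to the client application's own account.
    ec2 = boto3.client("ec2", region_name="us-east-1")

    response = ec2.create_vpc_endpoint(
        VpcEndpointType="Interface",
        VpcId="vpc-0123456789abcdef0",
        ServiceName="com.amazonaws.vpce.us-east-1.vpce-svc-xxxxxxxxxxxxxxxxx",
        SubnetIds=["subnet-aaaa1111", "subnet-bbbb2222"],
        SecurityGroupIds=["sg-0123456789abcdef0"],
        PrivateDnsEnabled=False,
    )

    # Traffic to the cluster now flows over this endpoint's private IPs
    # instead of requiring the two networks to be peered.
    print(response["VpcEndpoint"]["VpcEndpointId"])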

 

      Encryption with customer-managed keys (BYOK) – For government, health, finance, and many other industries that require a greater degree of privacy and data control, ensuring only the appropriate individuals or third parties can access certain data is a must. To help any company adopt pervasive event streaming, this new capability enables customers to manage their own encryption keys for at-rest data on Dedicated clusters using their cloud provider's native key management service. It’s now possible to encrypt at-rest cluster data with an encryption key generated in AWS Key Management Service (KMS), so only the appropriate entity or user can decrypt it.
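
The sketch below shows, assuming AWS KMS is the key management service in use, how a customer-managed key might be provisioned with boto3. The alias and description are illustrative, and the key policy granting Confluent Cloud access (supplied in Confluent's BYOK setup flow) is omitted.

    import boto3

    # Create a symmetric customer-managed key in the customer's own AWS account.
    kms = boto3.client("kms", region_name="us-east-1")

    key = kms.create_key(
        Description="Customer-managed key for a Confluent Cloud Dedicated cluster",
        KeyUsage="ENCRYPT_DECRYPT",
    )
    key_arn = key["KeyMetadata"]["Arn"]

    # A friendly alias makes the key easier to reference; the name is illustrative.
    kms.create_alias(
        AliasName="alias/confluent-cloud-byok",
        TargetKeyId=key_arn,
    )

    # The key ARN is supplied when provisioning the Dedicated cluster, so data
    # at rest is encrypted under a key the customer controls and can revoke.
    print(key_arn)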