Confluent, Inc. (NASDAQ: CFLT), the data streaming pioneer, announced Stream Designer, a visual interface that enables developers to build and deploy streaming data pipelines in minutes. This point-and-click visual builder is a major advancement toward democratizing data streams, making them accessible to developers beyond specialized Apache Kafka experts. With more teams able to rapidly build and iterate on streaming pipelines, organizations can quickly connect more data throughout their business for agile development and better, faster, in-the-moment decision making.

"We are in the middle of a major technological shift, where data streaming is making real time the new normal, enabling new business models, better customer experiences, and more efficient operations," said Jay Kreps, Co-founder and CEO, Confluent. "With Stream Designer we want to democratize this movement toward data streaming and make real time the default for all data flow in an organization."
Data streaming has become the default mode of data operations for successful modern businesses. Streaming technologies that were once at the edges are now core to critical business functions. This shift is fueled by the growing demand to deliver data instantly and at scale across a full range of customer experiences and business operations. Traditional batch processing can no longer keep pace with the growing number of use cases that depend on real-time updates across an ever-expanding set of data sources.
Organizations are seeking ways to accelerate their data streaming initiatives as more of their business operates in real time. Kafka is the de facto standard for data streaming, used by over 80% of Fortune 100 companies to reliably handle large volumes and varieties of data in real time. However, building streaming data pipelines on open-source Kafka requires large teams of highly specialized engineering talent, with time-consuming development spread across multiple tools. This puts pervasive data streaming out of reach for many organizations and leaves existing legacy pipelines clogged with stale, outdated data.