
Latest news with #UnityCatalog

PuppyGraph Announces New Native Integration to Support Databricks' Managed Iceberg Tables

Business Wire

13-06-2025



SAN FRANCISCO--(BUSINESS WIRE)--PuppyGraph, the first real-time, zero-ETL graph query engine, today announced native integration with Managed Iceberg Tables on the Databricks Data Intelligence Platform. This milestone allows organizations to run complex graph queries directly on Iceberg tables governed by Unity Catalog, with no data movement and no ETL pipelines.

Databricks Managed Iceberg Tables, launching in Public Preview at this year's Data + AI Summit, offer full support for the Apache Iceberg™ REST Catalog API. This allows external engines, such as Apache Spark™, Apache Flink™, and Apache Kafka™, to interoperate seamlessly with tables governed by Unity Catalog. Managed Iceberg Tables also provide automatic performance optimizations, delivering cost-efficient storage and fast queries out of the box.

By combining PuppyGraph's in-place graph engine with the openness and scale of Managed Iceberg Tables, teams can now:

  • Query massive Iceberg datasets as a live graph, in real time
  • Use graph traversal to detect fraud, lateral movement, and network paths
  • Perform root cause analysis on telemetry data using service relationship graphs
  • Eliminate the need for ETL into siloed graph databases
  • Scale analytics across petabytes with minimal operational overhead

Coinbase and CipherOwl are joint customers of Databricks and PuppyGraph. At the Data + AI Summit, both will share how graph analytics has powered their products and enabled real-time insights directly on managed lakehouses.

"This changes how graph analytics fits into the modern data stack," said Weimo Liu, CEO of PuppyGraph. "Databricks' new Iceberg capabilities provide a truly open, scalable foundation. With PuppyGraph, teams can ask complex relationship-driven questions without ever leaving their lakehouse."

To learn more about how PuppyGraph integrates with Apache Iceberg™ and the Databricks Data Intelligence Platform, visit or see the joint talk with Coinbase at Data + AI Summit 2025.

About PuppyGraph: PuppyGraph is the first and only real-time, zero-ETL graph query engine on the market, empowering data teams to query existing relational data stores as a unified graph model deployed in under 10 minutes, bypassing the cost, latency, and maintenance hurdles of traditional graph databases. Capable of scaling to petabytes of data and executing complex 10-hop queries in seconds, PuppyGraph supports use cases from enhancing LLMs with knowledge graphs to fraud detection, cybersecurity, and more. It is trusted by industry leaders including Coinbase, Netskope, CipherOwl, Prevalent AI, Clarivate, and others. Learn more at and follow the company on LinkedIn, YouTube and X.
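The traversal use cases named above (fraud rings, lateral movement, network paths) all reduce to walking edge tables that already live in the lakehouse, bounded by a hop count. As a minimal, self-contained sketch of that idea only, not PuppyGraph's engine or query language, and with invented example data, a breadth-first, hop-limited walk over an in-memory edge list looks like this:

```python
from collections import deque

def reachable_within(edges, start, max_hops):
    """Breadth-first traversal: every node reachable from `start`
    in at most `max_hops` hops over a directed edge list."""
    adjacency = {}
    for src, dst in edges:
        adjacency.setdefault(src, []).append(dst)
    seen = {start}
    frontier = deque([(start, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue  # hop budget exhausted along this path
        for nxt in adjacency.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, depth + 1))
    return seen - {start}

# Hypothetical data: money transfers between accounts.
transfers = [("a", "b"), ("b", "c"), ("c", "d"), ("x", "y")]
print(reachable_within(transfers, "a", 2))  # {'b', 'c'}
```

In practice a graph engine exposes declarative graph query languages over the same relational tables rather than hand-written traversals, but the hop-bounded expansion shown here is the core operation behind queries like "which accounts are within two transfers of this one?"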

Fivetran awarded Databricks 2025 data integration partner of year

Techday NZ

12-06-2025



Fivetran has been named the 2025 Databricks Data Integration Partner of the Year. The award recognises the collaborative efforts between Fivetran and Databricks to provide data foundations for analytics and artificial intelligence to enterprise customers. The recognition follows a 40 percent year-over-year increase in the number of joint customers using Fivetran and Databricks to manage and analyse data.

Fivetran offers solutions that allow organisations to centralise data from a wide array of sources, such as SaaS applications, databases, files, and event streams, into the Databricks Data Intelligence Platform. By automating data movement and streamlining pipeline management, Fivetran aims to reduce the engineering resources required by its clients while ensuring more reliable and faster access to data.

Growth and integration

The past year has seen the partnership between Fivetran and Databricks expand further, with the introduction of advanced integrations into Unity Catalog and Delta Lake. These integrations help customers meet governance requirements while making use of both structured and unstructured data. As more organisations look to refine their data operations, the combined capabilities of Fivetran and Databricks are cited as helping to reduce operational overhead, enhance performance, and expedite the transformation of raw data into actionable insights.

"Databricks continues to be a strategic partner as more companies invest in modern data infrastructure. This recognition speaks to the value we are delivering together for customers who need reliable, secure data pipelines to support production-grade AI and analytics. We are proud to help build the foundation for what comes next," said Logan Welley, Vice President of Alliances at Fivetran, underscoring the role of the partnership in supporting enterprise clients adopting artificial intelligence and analytics-driven solutions.

Launch partner initiatives

Fivetran has also been announced as a launch partner for Databricks Managed Iceberg Tables. This new feature is designed to give customers access to open, high-performance data formats optimised for large-scale analytics and artificial intelligence. Through its integration with Unity Catalog, Fivetran seeks to offer enterprises a consistent approach to data governance and efficient data access as they scale their workloads and expand their analytics and AI use cases.

The solution is currently employed by a range of organisations across industries. National Australia Bank, for example, uses Fivetran's Hybrid Deployment model to operate data pipelines within its own cloud infrastructure while utilising Databricks for processing and analytics. This structure allows the bank to meet stringent compliance requirements while modernising its infrastructure and accelerating its artificial intelligence adoption efforts. Other companies, including OpenAI, Pfizer, and Dropbox, use Fivetran to move data into Databricks for applications ranging from real-time analytics to machine learning in production. The goal for these organisations is to improve operational speed and inform decision-making.

Partner perspectives

"As enterprise demand for data intelligence grows, Fivetran has been an important partner for us in helping organisations move faster with data. Their focus on automation, scale, and governance aligns with what our customers need as they bring more data-driven AI applications from production to market," said Roger Murff, Vice President of Technology Partners at Databricks, highlighting the significance of the partnership in meeting evolving customer needs in the data intelligence sector.

Fivetran reports that its automated pipelines, security measures, and managed experience are intended to support compliance and facilitate AI-focused data infrastructure modernisation for its enterprise clients.

Databricks Unveils Lakeflow Designer for Data Analysts to Build Reliable Pipelines Without Coding

Cision Canada

11-06-2025



New Lakeflow Designer offers a drag-and-drop interface to generate production pipelines; Lakeflow is now Generally Available.

SAN FRANCISCO, June 11, 2025 /CNW/ -- Data + AI Summit -- Databricks, the Data and AI company, today announced the upcoming Preview of Lakeflow Designer. This new no-code ETL capability lets non-technical users author production data pipelines using a visual drag-and-drop interface and a natural language GenAI assistant. Lakeflow Designer is backed by Lakeflow, the unified solution for data engineers to build reliable data pipelines faster with all business-critical data, which is now Generally Available.

Traditionally, enterprises have faced a tradeoff: either let analysts create pipelines with no-code/low-code tools, sacrificing governance, scalability, and reliability, or rely on technical data engineering teams to code production-ready pipelines, even though those teams are overloaded and their backlogs are long. In the end, most enterprises adopt a combination of both approaches, resulting in complex environments to manage and maintain. What data-driven enterprises really want is the best of both worlds: no-code pipelines with governance, scalability, and reliability.

"There's a lot of pressure for organizations to scale their AI efforts. Getting high-quality data to the right places accelerates the path to building intelligent applications," said Ali Ghodsi, Co-founder and CEO at Databricks. "Lakeflow Designer makes it possible for more people in an organization to create production pipelines so teams can move from idea to impact faster."

Lakeflow Designer: AI-Native Drag-and-Drop Data Prep for the Business Analyst

The new Lakeflow Designer empowers business analysts to build no-code ETL pipelines with natural language and a drag-and-drop UI, providing the same scalability, governance, and maintainability as pipelines built by data engineers. Backed by Lakeflow, Unity Catalog, and Databricks Assistant, Lakeflow Designer eliminates the divide between code and no-code tools. With this new approach, non-technical users gain the speed and flexibility they need to solve business problems without burdening data engineers with maintenance issues and governance headaches.

Additional Lakeflow Capabilities Launching

  • Lakeflow enters GA: Today, Lakeflow became generally available, providing a unified data engineering solution from ingestion to transformation and orchestration. Notably, the new Lakeflow Declarative Pipelines capabilities allow data engineers to build end-to-end production pipelines in SQL or Python without having to manage infrastructure.
  • New IDE for data engineering: Lakeflow is debuting a brand-new development experience that speeds up data pipeline development with AI-assisted coding, debugging, and validation in an integrated UI.
  • New ingestion connectors: Point-and-click ingestion connectors for Lakeflow Connect are launching for Google Analytics, ServiceNow, SQL Server, SharePoint, PostgreSQL, and SFTP, joining the connectors already available for Salesforce Platform and Workday Reports.
  • Direct write to Unity Catalog with Zerobus: Zerobus enables developers to write high volumes of event data with near real-time latency to their lakehouse without managing extra infrastructure like a message bus. This streamlined, serverless infrastructure provides performance at scale for IoT events, clickstream data, telemetry, and other event-driven use cases.

Customer Momentum

"The new editor brings everything into one place: code, pipeline graph, results, configuration, and troubleshooting. No more juggling browser tabs or losing context. Development feels more focused and efficient. I can directly see the impact of each code change. One click takes me to the exact error line, which makes debugging faster. Everything connects: code to data, code to tables, tables to code. Switching between pipelines is easy, and features like auto-configured utility folders remove complexity. This feels like the way pipeline development should work." -- Chris Sharratt, Data Engineer, Rolls-Royce

"Using the Salesforce connector from Lakeflow Connect helps us close a critical gap for Porsche from the business side on ease of use and price. On the customer side, we're able to create a completely new customer experience that strengthens the bond between Porsche and the customer with a unified, rather than fragmented, customer journey," said Lucas Salzburger, Project Manager, Porsche Holding Salzburg.

"Joby is able to use our manufacturing agents with Lakeflow Connect Zerobus to push gigabytes a minute of telemetry data directly to our lakehouse, accelerating the time to insights -- all with Databricks Lakeflow and the Data Intelligence Platform." -- Dominik Müller, Factory Systems Lead, Joby Aviation

Availability

At Data + AI Summit, Databricks is launching Lakeflow into General Availability. The new IDE for data engineering is entering Public Preview, new ingestion connectors are launching across various release states, and Zerobus is entering Private Preview. Lakeflow Designer will enter Private Preview shortly after Data + AI Summit.

About Databricks

Databricks is the Data and AI company. More than 15,000 organizations worldwide -- including Block, Comcast, Condé Nast, Rivian, Shell and over 60% of the Fortune 500 -- rely on the Databricks Data Intelligence Platform to take control of their data and put it to work with AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake, MLflow, and Unity Catalog. To learn more, follow Databricks on X, LinkedIn and Facebook.
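The "declarative pipelines" idea mentioned in the announcement means engineers state what each dataset is and what it depends on, and the engine derives execution order, rather than hand-sequencing jobs. As a toy, self-contained illustration of that pattern only (all names are invented; this is not the Lakeflow API), a small registry can resolve dependencies with a topological sort:

```python
# Toy sketch of declarative pipelines: each "table" declares its
# inputs, and a tiny engine derives a valid execution order.
# Hypothetical names throughout; not Databricks' actual API.
from graphlib import TopologicalSorter

tables = {}  # name -> (declared inputs, transform function)

def table(name, inputs=()):
    """Register a transform as a named dataset with declared inputs."""
    def register(fn):
        tables[name] = (tuple(inputs), fn)
        return fn
    return register

@table("raw_orders")
def raw_orders():
    return [{"id": 1, "amount": 50}, {"id": 2, "amount": 120}]

@table("big_orders", inputs=["raw_orders"])
def big_orders(raw):
    return [o for o in raw if o["amount"] > 100]

def run():
    """Resolve dependency order, then materialize each dataset once."""
    order = TopologicalSorter(
        {name: deps for name, (deps, _) in tables.items()}
    ).static_order()
    results = {}
    for name in order:
        deps, fn = tables[name]
        results[name] = fn(*(results[d] for d in deps))
    return results

print(run()["big_orders"])  # [{'id': 2, 'amount': 120}]
```

The design point the sketch illustrates: because dependencies are declared rather than scheduled by hand, the engine, not the author, owns ordering, retries, and infrastructure, which is what lets the same definitions back both a code-first IDE and a no-code designer.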

Databricks Eliminates Table Format Lock-in and Adds Capabilities for Business Users with Unity Catalog Advancements

Cision Canada

11-06-2025



Unity Catalog is now the most complete catalog for Apache Iceberg™ and Delta Lake, enabling open interoperability with governance across compute engines, and adds unified semantics and a rich discovery experience for business users.

SAN FRANCISCO, June 11, 2025 /CNW/ -- Data + AI Summit -- Databricks, the Data and AI company, today extended its leadership in the unified governance category with powerful new capabilities. Unity Catalog adds full support for Apache Iceberg™ tables, including native support for the Apache Iceberg REST Catalog APIs. Unity Catalog is now the only catalog that enables external engines to read and write, with fine-grained governance, to performance-optimized managed Iceberg tables, eliminating lock-in and enabling seamless interoperability.

Databricks is also introducing two enhancements that extend Unity Catalog to business users. Business metrics and KPIs, the foundation of how companies manage their business, can now be defined as first-class data assets with Unity Catalog Metrics. In addition, data and AI discovery is enhanced for business users with a new, curated internal marketplace that surfaces the highest-value data, AI, and AI/BI assets, organized by business domain. All these assets are augmented with automated data intelligence, so every team can find, trust, and act on the right data.

Unity Catalog Now Eliminates the Need to Choose Between Formats

Built on open standards, Unity Catalog is designed to work across every table format and engine. Databricks is now taking that vision further with the Public Preview of full Apache Iceberg support, uniting the Apache Iceberg and Delta Lake ecosystems with a single approach to governance. The preview adds three new capabilities.

First, organizations can create managed Apache Iceberg tables that any Iceberg-compatible engine can read and write through Unity Catalog's Iceberg REST Catalog API. These managed Iceberg tables benefit from the full power of Unity Catalog: best price-performance with AI-powered Predictive Optimization, and unified governance and policy enforcement both within Databricks and across external engines, including Trino, Snowflake, and Amazon EMR.

Second, Unity Catalog's pioneering Lakehouse Federation capabilities enable seamless access to Iceberg tables managed in external catalogs, so those tables can be discovered and governed alongside native tables.

Third, Iceberg tables get all the benefits of the Delta Sharing ecosystem, including seamless cross-organizational sharing. Together, these capabilities eliminate format-driven data silos; no other catalog in the industry provides them.

A Growing Disconnect Between Data Platforms and Business Users

While data platforms have advanced rapidly for technical users, teams across the business remain disconnected from the systems that power their decisions. Technical teams center their world on tables, files, compute, and code, while business users operate in BI tools and AI chatbots, focused on the KPIs and metrics of their business domains. These fundamentally different languages leave business users unsure of what data to trust, or reliant on engineers for basic questions. Without a unified foundation for business context, organizations face duplicated work, decision paralysis, and a persistent gap between data and action.

A Single Source of Truth for Metrics Across the Business

To address this need, Unity Catalog Metrics brings business metric definitions, traditionally embedded within BI tools, to the data platform. This creates consistency and accuracy in how everyone in the organization understands business performance. Unlike proprietary BI semantic layers, Unity Catalog Metrics are fully addressable via SQL, ensuring that everyone in the organization shares the same view of metrics, irrespective of what tool they choose.
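To make the "metrics addressable via SQL" idea concrete: the point is that a metric is defined once in the platform, and every consumer queries that shared definition instead of re-implementing it per BI tool. Below is a minimal stand-in sketch using SQLite as the database (this is not Unity Catalog, and all table and metric names are invented for illustration):

```python
# Sketch: "define a metric once, query it from any SQL tool".
# SQLite stands in for the platform; names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, region TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'EMEA', 100.0), (2, 'EMEA', 50.0), (3, 'AMER', 200.0);

    -- The metric definition lives with the data, not in each BI tool:
    CREATE VIEW revenue_by_region AS
        SELECT region, SUM(amount) AS revenue
        FROM orders GROUP BY region;
""")

# Every consumer (dashboard, notebook, agent) issues plain SQL
# against the shared definition and gets identical numbers.
rows = conn.execute(
    "SELECT region, revenue FROM revenue_by_region ORDER BY region"
).fetchall()
print(rows)  # [('AMER', 200.0), ('EMEA', 150.0)]
```

Because the aggregation logic is centralized, changing the definition in one place (say, excluding refunds) changes it for every tool at once, which is the consistency property the announcement emphasizes.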
Unity Catalog Metrics is available to all customers today as a Public Preview and will be Generally Available later this summer.

A Unified Foundation for Context: From Guided Discovery to Intelligent Insights

To make trusted data truly usable for business users, Databricks is introducing new Unity Catalog capabilities that blend intuitive discovery with built-in intelligence. A new Discover experience offers a curated internal marketplace of certified data products, organized by business domains like Sales, Marketing, or Finance and enriched with documentation, ownership, tagging, and usage insights. Automated, intelligent recommendations coupled with data-steward curation tools ensure that the highest-value assets (metrics, dashboards, tables, AI agents, Genie spaces, and more) can easily be explored, understood, trusted, and accessed through a self-serve workflow, without manual approvals or engineering support. Unity Catalog Discover is now in Private Preview.

Unity Catalog also now adds intelligence across the experience, surfacing data quality signals, usage patterns, relationships across assets, and certification and deprecation status to help users quickly assess trust and relevance. With Databricks Assistant built into Unity Catalog, users can ask natural language questions and get grounded, contextual answers based on governed metrics, turning discovery into a guided journey where data is accessible, explainable, trustworthy, and ready for use.

"We created the Unified Governance category with Unity Catalog four years ago," said Matei Zaharia, Co-founder and CTO of Databricks. "With these updates to Unity Catalog, we are now offering the best catalog in the industry for Apache Iceberg and all open table formats, and the only one that allows reads and writes to managed tables from external engines, for a truly open enterprise catalog. No matter what table format our customers choose, we ensure it's accessible, optimized, and governed. And with our expanded focus on business users, we're ensuring we deliver on the promise of democratizing data + AI to every user in the enterprise."

Customer and Partner Quotes

"At Riskified, we want to store all our data in an open format and want a single catalog that can connect to all the tools we use," said Hen Ben-Hemo, Data Platform Architect at Riskified. "Unity Catalog allows us to write Iceberg tables that are fully open to any Iceberg client, unlocking the entire lakehouse ecosystem and future-proofing our architecture."

"Unity Catalog Metrics gives us a central place to define business KPIs and standardize semantics across teams, ensuring everyone works from the same trusted definitions across dashboards, SQL, and AI applications." -- Richard Masters, Vice President, Data & AI, Virgin Atlantic

"Unity Catalog Metrics presents an exciting opportunity to establish consistency, trust, and control in how business metrics are defined and consumed across Zalando. It is a promising contribution to aligned, data-driven decisions across our BI dashboards, notebooks, and other tools." -- Timur Yuere, Engineering Manager, Zalando

"Unity Catalog Metrics represents an exciting opportunity for Tableau customers to leverage the value of centralized governance with Databricks Unity Catalog. Through our deep integration and expanding roadmap with Databricks, we're thrilled to help remove the friction for our customers in leveraging Databricks to define their core business metrics." -- Nicolas Brisoux, Sr. Director of Product Management, Tableau

"We're excited to partner with Databricks to integrate Unity Catalog Metrics into Sigma. This gives business teams direct access to trusted, standardized business metrics within their dashboards, so everyone can make decisions based on consistent definitions, without relying on data teams for every question." -- Dillion Morrison, VP of Product, Sigma Computing

Availability

Full Apache Iceberg support in Unity Catalog is entering Public Preview. Unity Catalog Metrics is available to all customers today as a Public Preview and will be Generally Available later this summer. Unity Catalog Discover is now in Private Preview.

About Databricks

Databricks is the Data and AI company. More than 15,000 organizations worldwide -- including Block, Comcast, Condé Nast, Rivian, Shell and over 60% of the Fortune 500 -- rely on the Databricks Data Intelligence Platform to take control of their data and put it to work with AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake, MLflow, and Unity Catalog. To learn more, follow Databricks on X, LinkedIn and Facebook.
