Azure – Public preview: Azure Functions .NET worker cold start improvements
Major cold start improvements are available to .NET apps on the Azure Functions isolated worker model through a preview configuration.
Read More for the details.
You can now run your Azure Container Apps on dedicated workload profiles that offer more CPU and memory when needed.
Read More for the details.
You can now define UDRs to manage how outbound traffic is routed for your container app environment’s subnet.
Read More for the details.
You can now run production workloads that perform a task for a finite duration and exit, and schedule these workloads or run them in response to events.
Read More for the details.
You can now run applications that accept TCP connections on multiple ports.
Read More for the details.
You can now encrypt traffic transmitted between applications within an environment using mTLS.
Read More for the details.
Azure Functions now supports .NET 8 preview 7, using the isolated worker model.
Read More for the details.
Starting today, AWS Application Migration Service (AWS MGN) supports additional application validation, configuration and modernization actions.
Read More for the details.
Amazon Relational Database Service (RDS) for PostgreSQL now supports the Rust programming language as a new trusted procedural language in PostgreSQL major versions 13 and 14, extending the Rust support already available in major version 15. This helps you build high-performance user-defined functions to extend PostgreSQL for compute-intensive data processing.
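To make this concrete, here is a minimal, hedged sketch of a trusted-language UDF in PL/Rust; the function name is ours, and it assumes the plrust extension has already been enabled on the instance through its parameter group.

```sql
-- Minimal sketch (illustrative, not from the announcement); assumes the
-- plrust extension is enabled on the RDS for PostgreSQL instance.
CREATE EXTENSION IF NOT EXISTS plrust;

-- A trusted-language UDF compiled to native code. With STRICT, NULL inputs
-- short-circuit and arguments arrive as plain Rust types (BIGINT -> i64).
CREATE FUNCTION square(val BIGINT) RETURNS BIGINT STRICT
LANGUAGE plrust AS $$
    Ok(Some(val * val))
$$;

SELECT square(12);  -- returns 144
```

Because the function body is compiled Rust rather than an interpreted language, this is the kind of UDF suited to compute-intensive data processing.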
Read More for the details.
The new Azure Portal experience for Azure Database Migration Service (DMS) lets you perform seamless migrations from on-premises SQL Server to Azure SQL targets.
Read More for the details.
You can now restore Azure Database for MySQL – Flexible Server to any supported Azure region.
Read More for the details.
General availability enhancements and updates were released for Azure SQL in late August 2023.
Read More for the details.
Azure Database for PostgreSQL – Flexible Server now supports minor versions 15.3, 14.8, 13.11, 12.15, 11.20.
Read More for the details.
As more organizations adopt open database standards, they need easy-to-use, high-performance migration tools, especially for heterogeneous migrations between different database engines. This week, we announced that Database Migration Service (DMS) has several new capabilities for converting and migrating Oracle databases to PostgreSQL.
DMS is a fully managed serverless cloud service that enables migration with minimal downtime and significantly reduces the complexity of migrating data, schema, and SQL code. It creates initial snapshots faster, performs low-latency, non-intrusive change data capture, and further reduces downtime with its parallel loading capability.
Now generally available, support for Oracle to PostgreSQL migrations in DMS can help enterprises migrate data and schema and convert complex PL/SQL code to PostgreSQL-compatible SQL in a single, integrated experience. With this capability, DMS significantly improves time-to-value for customers by providing a one-stop shop for data, schema, and integrated code conversion. In addition, Database Migration Service offers Duet AI-assisted code conversion, which helps with last-mile conversions: the conversion mechanism learns and evolves from the code edits you make. All user edits are tracked and recommendations are provided in real time, giving you the flexibility to accept or reject them; those decisions in turn refine the model and improve future recommendations, reducing manual intervention over time.
You can sign up for the preview today for both Oracle to AlloyDB for PostgreSQL migrations and Duet AI-assisted code conversion on the DMS home page.
Let’s see how the new capabilities in DMS can help you migrate Oracle databases to Cloud SQL for PostgreSQL on Google Cloud.
Customers modernize their databases and leverage managed PostgreSQL to reduce costs, increase flexibility, improve performance, and avoid vendor lock-in by adopting widely accepted open-source standards. With AlloyDB, we combined the openness of PostgreSQL with capabilities to make it more than 4x faster for transactional workloads and deliver up to 100x faster analytical queries than standard PostgreSQL.
However, even with these obvious benefits, migrating off of legacy Oracle databases can be complex, requiring expertise in the databases and in their intricate code and application dependencies. Customers inevitably have to put in a lot of manual effort, leverage disparate and sometimes disconnected third-party tools, and call on partners to accomplish the modernization. Sometimes, the amount of manual effort is so high that it deters customers from even embarking on the journey, leaving them locked into an expensive legacy database, without the flexibility and innovation they need to succeed in their digital transformation.
In addition to the general availability mentioned above, this week at Google Cloud Next we launched a new integrated code conversion capability in Database Migration Service to convert database code, including stored procedures, functions, triggers, packages, and custom PL/SQL code, to PostgreSQL-compatible SQL. Currently in preview, the code conversion capability performs not only syntactic conversion but also semantic conversion to ensure that the resulting SQL is correct. Because it understands both the source and destination dialects, this new capability minimizes the time it takes to perform a migration (code conversion is typically the longest pole in the migration process). With DMS, both code and schema conversion are done automatically.
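To make the kind of rewrite concrete, here is a hedged, simplified illustration of a conversion (the function and its logic are our example, not actual DMS output): an Oracle PL/SQL function and a PostgreSQL-compatible equivalent.

```sql
-- Oracle PL/SQL source (illustrative):
CREATE OR REPLACE FUNCTION get_bonus(p_salary IN NUMBER)
RETURN NUMBER IS
BEGIN
  RETURN p_salary * 0.10;
END;
/

-- A PostgreSQL-compatible equivalent of the kind a conversion
-- workspace produces:
CREATE OR REPLACE FUNCTION get_bonus(p_salary NUMERIC)
RETURNS NUMERIC LANGUAGE plpgsql AS $$
BEGIN
  RETURN p_salary * 0.10;
END;
$$;
```

Real conversions must also handle semantic differences (type mappings such as NUMBER to NUMERIC, package state, exception handling), which is why semantic conversion matters beyond syntax.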
Since we announced the preview launch of Oracle to PostgreSQL schema and data migration in DMS last year, we’ve helped many organizations migrate their on-premises workloads to Google Cloud, and they’ve reported the following benefits:
Simple and intuitive user experience enabling users to configure, manage, and monitor complex migrations without having to learn complex database-related technology.
High-speed initial snapshot capability to copy the data from Oracle into Cloud SQL for PostgreSQL or AlloyDB for PostgreSQL.
Replication and Change Data Capture (CDC) capabilities to keep the target database in sync and reduce downtime.
Automatic transition from initial snapshot to Change Data Capture to prevent any data gaps that might occur due to manual configuration.
Persisted and reusable connection management to store connection profiles along with credentials in a secure manner.
Integrated code conversion to convert complex PL/SQL code to Cloud SQL for PostgreSQL-compatible SQL via a conversion workspace.
Seamless transition from source conversion to data movement.
Migrating is easy with DMS. To start, navigate to the Database Migration page in the Google Cloud console, create a new migration job, and take these five simple steps:
Create your source connection profile, which contains information about the source database. The connection profile can later be used for additional migrations.
Create a conversion workspace that automatically converts your source Oracle schema and PL/SQL code to a PostgreSQL schema and compatible SQL.
Apply the converted schema objects and SQL code on your destination Cloud SQL for PostgreSQL instance.
Create a migration job and choose the conversion workspace and connection profiles previously created.
Test your migration job and get started whenever you’re ready.
Once the migration job starts, DMS takes an initial snapshot of your data, then replicates new changes as they happen. The migration job will continue to replicate the source data until you decide to finalize the migration.
For more information to help get you started on your migration journey, head over to the documentation or start training with this Database Migration Service Qwiklab. In addition, you can view this conversion workspace introduction video. To get started with the new Database Migration Service for Oracle to PostgreSQL migrations, simply visit the database migration page in the console.
Read More for the details.
Migrating off expensive, legacy databases, wherever they may be, remains a top priority for enterprises. Last year, we announced the general availability of AlloyDB, a fully-managed, PostgreSQL-compatible database service that provides a powerful option for modernizing your most demanding enterprise database workloads, and for scaling existing PostgreSQL workloads without having to change the application. Then, earlier this year, we launched the technology preview of AlloyDB Omni, a downloadable edition of AlloyDB designed to run on-premises, at the edge, across clouds, or even on developer laptops. With the AlloyDB Omni technology preview, customers explored how to meet requirements for enterprise-grade PostgreSQL in their own environments, while also helping us improve the product as we progressed to general availability.
During the technology preview, customers asked for guidance on data protection (backups, replication, HA/DR) and suitable platforms, integration with more extensions, and access to Google’s advanced AI capabilities. To help accelerate migrations off of expensive legacy databases, customers also asked us to provide simple and affordable pricing for AlloyDB Omni.
This week at Google Cloud Next, we announced the public preview of AlloyDB Omni. For the public preview, we’ve added support for AlloyDB AI to make it easy to rapidly build enterprise generative AI applications, and published detailed technical guides to achieve high levels of data protection using a combination of open-source tools and Google-supported configurations. We also added compatibility with PostgreSQL 15, included utilities and a CLI to make it easier to manage AlloyDB Omni databases, and made it easier to communicate configuration information to Google support when help is needed. In addition, we announced support for AlloyDB Omni as another choice in Database Service for Google Distributed Cloud Hosted, our air-gapped private cloud for regulated customers. The public preview is available for download today at https://cloud.google.com/alloydb/omni.
If you’re ready to try AlloyDB Omni, read more about it in the documentation and download the public preview today. As part of the download process, you will also accept the Terms of Service. Note that AlloyDB Omni is not yet suitable for production use. We look forward to your feedback!
Read More for the details.
Redis has continued to grow in popularity, whether that’s to reduce load on a database to save money, improve the end user experience with lower latency, or simply because developers love it. However, in industries like gaming (leaderboards, session stores), finance (fraud detection), ads (ultra-low-latency serving), or retail (checkouts), workloads push the boundaries of a standalone Redis shard and require the performance and horizontal scale of Redis Cluster.
To serve these demanding Redis workloads, we announced the preview of Memorystore for Redis Cluster at Google Cloud Next. Compared to the existing service, Memorystore for Redis Cluster provides 60 times more throughput with microsecond latencies and supports 10 times more data. This new fully managed service for Redis Cluster is fully open-source compatible, easy to set up, and provides zero-downtime scaling with ultra-low and predictable latency. With Memorystore for Redis Cluster, you get intelligent and automatic zonal distribution of nodes for high availability and resilience, automated replica management and promotion, and a 99.99% availability SLA upon our General Availability launch.
Customers depend on Memorystore for its low latency and reliability. Now, with Memorystore for Redis Cluster, we’ve taken scale and performance to new heights. With a single mouse click or gcloud command, you can now easily scale your clusters with zero downtime to terabytes of keyspace, and up to 60 times the throughput of Memorystore for Redis. Best of all, we’ve made scaling improvements directly to the Redis engine to provide an improved and differentiated Redis scaling experience.
Verve Group is an advertising technology ecosystem that relies on ultra-low latency Redis Clusters to deliver real-time ads, and recently tested Memorystore for Redis Cluster. In addition to ease of management, Verve experienced first-hand the scaling improvements we made to the Redis engine to improve slot migrations, de-risk scaling, and directly address the data corruption risks present in open-source Redis Cluster scaling.
“Memorystore for Redis Cluster exceeded our performance expectations. We observed extremely high throughput with very low latency. We were also able to scale down to 10 nodes on the fly, illustrating the ease of zero-downtime scalability of the product.” – Ville Lamminen, DevOps manager, Verve Group
Lowering Total Cost of Ownership (TCO) and saving you money while reducing operational overhead is fundamental to the value proposition of Memorystore for Redis Cluster. With this new service, you no longer have to self-manage tens or hundreds of Redis nodes on Compute Engine, worrying about high toil and tedious tasks like zonal distribution for high availability, VM failures, Redis tuning, replica management, and complex scaling operations. Instead, with Memorystore for Redis Cluster’s 10x scale, you can consolidate smaller workloads into a highly performant and resilient cluster to drive cost savings and reduce operational overhead. And with its pay-as-you-go (PAYG) model, you can easily take advantage of zero-downtime scaling to meet the demands of your workloads, transforming events like Black Friday/Cyber Monday into a simple click of the mouse.
“With Memorystore for Redis Cluster, we can easily scale our clusters to meet the demands of our largest workloads. Best of all, we’re able to offload the complexities of cluster management to Google instead of self-managing on Compute Engine and requiring in-house Redis Cluster expertise. Our testing of Memorystore for Redis Cluster reveals this new product to be easy to use, highly performant, and easily scalable.” – Animesh Chaturvedi, distinguished engineer, Palo Alto Networks
While conceptually simple, the actual practice of self-managing node and replica placement across availability zones can create significant engineering toil and easily result in costly mistakes. With Memorystore for Redis Cluster, we automatically distribute Redis Cluster nodes across zones to achieve maximum availability and resiliency. When you add replica nodes, we automatically place them in a different zone than the primary, adding resiliency to the cluster and protecting the data from a zonal outage.
We also take responsibility for provisioning and configuring infrastructure for optimal performance and the lowest possible latency. Behind the scenes, we manage the software and security patches, detect failures, and trigger automatic failovers and VM replacements, freeing up engineering organizations to focus on building their applications rather than managing open-source software and Compute Engine infrastructure.
To simplify connectivity and provide you with a first-class managed experience, we’re launching Memorystore for Redis Cluster with Private Service Connect (PSC). PSC offers a private and secure default connection option that ensures traffic never leaves Google’s network. With PSC, Memorystore for Redis Cluster provides a simple onboarding experience with a single-step cluster provisioning process, allowing you to configure private networking without needing to be a networking expert. PSC provides granular connectivity with minimal IP consumption (just two IP addresses per cluster, even for more than 100 shards). PSC also addresses Virtual Private Cloud (VPC) peering limitations, is accessible from any region, and provides advanced security controls.
GET Started, SET up your cluster
Memorystore for Redis Cluster is available today in preview in the Google Cloud console, and we encourage you to experience how it can help you achieve superior scale and performance. Please visit our documentation to learn more about additional features, such as the integration with Identity and Access Management (IAM), Cloud Monitoring, Audit Logging, and how to enable in-transit encryption. Please don’t hesitate to reach out with questions or feedback to cloud-memorystore-pm@google.com.
Read More for the details.
Businesses across industries are increasingly combining their data with external sources or collaborating directly with partners to gain new insights and unlock business value. As more data is being shared within and across organizational boundaries, the security and privacy of data — especially sensitive data — is more important than ever.
To help solve these challenges, we announced that data clean rooms were coming to BigQuery, enabling organizations to create and manage secure environments for privacy-centric data sharing and analysis. BigQuery is a serverless and cost-effective enterprise data warehouse that works across clouds and scales with the data needs of the business, with built-in AI/ML and business intelligence for insights. With enterprise-grade protection for mission-critical workloads, BigQuery data clean rooms offer advanced security and privacy controls to help ensure that your teams can conduct meaningful analyses while protecting the underlying data.
Today, BigQuery data clean rooms are available in preview in all regions supported by Analytics Hub, the data sharing platform within BigQuery. With this capability, you can now:
Create and deploy clean rooms in a few clicks, collaborating across organizational boundaries, without needing to move or copy the underlying data in BigQuery
Perform n-way joins, including enriching your existing data with thousands of public and commercial datasets available in Analytics Hub
Use analysis rules and privacy policies to help protect your sensitive data
Leverage the same analysis rules as leaders in the data clean room space, such as aggregation thresholding, to help protect sensitive data (see the sketch after this list)
Prevent your data and query results from being copied or exported
Manage, track, and govern usage of shared data in clean rooms in a single pane of glass
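As a hedged illustration of how an analysis rule shapes a subscriber’s query (all dataset, view, and column names below are ours, not from the announcement), an aggregation-threshold query might look like this:

```sql
-- Illustrative only: query a view shared into a clean room with an
-- aggregation threshold, so that groups containing too few distinct
-- privacy units (here, users) are withheld from the results.
SELECT WITH AGGREGATION_THRESHOLD
  OPTIONS(threshold = 50, privacy_unit_column = user_id)
  country,
  COUNT(DISTINCT user_id) AS matched_users
FROM clean_room.shared_subscribers
GROUP BY country;
```

Groups with fewer than 50 distinct users never appear in the output, which is what lets partners analyze combined data without exposing individual-level records.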
Data clean rooms have many use cases across various industries. In the financial sector, sensitive data from various sources can be merged within clean rooms, leading to improved fraud detection and the creation of accurate credit risk scores. For retailers, point-of-sale data can be combined with marketing information, enabling more effective optimization of promotional strategies and data sharing with Consumer Packaged Goods suppliers. For publishers, data clean rooms help make it easier to securely share valuable data with advertisers that can be used to optimize advertising effectiveness.
Google Cloud customers like L’Oréal are excited to start using BigQuery data clean rooms. “We are intensively using Google Cloud for data analytics globally and are excited to be using BigQuery to build data clean rooms. In addition to saving time, money, and a highly reduced footprint because of data movement reduction and data sharing, we’re excited to get more value from the retailers that work with L’Oréal for a secure and streamlined data sharing experience that preserves user privacy.” – Antoine Castex, Data Platform Architect, L’Oréal
As clean rooms gain rapid traction across diverse verticals, marketers have emerged as early and enthusiastic adopters, leveraging clean rooms for tasks such as campaign assessment. But as marketing sophistication increases, data that is not in a clean room can become difficult to use, given the need to join various data sources, both owned and accessed through partnerships, securely and with privacy protections.
Existing customers of Ads Data Hub for Marketers are already able to analyze their first-party data in BigQuery and campaign data to understand their end-customer’s path to purchase. Early next year, customers will also be able to connect with Ads Data Hub for Marketers through BigQuery Analytics Hub, alongside other BigQuery data clean rooms.
Ads Data Hub for Marketers will retain its rigorous privacy checks that help protect the personal data of users online while enabling comprehensive analytics.
Habu, a data collaboration platform for privacy-centric data orchestration, is a launch partner for BigQuery data clean rooms. Customers like Disney Advertising use Google Cloud and Habu to help securely share data and surface insights to business users.
“Our partnership with Google Cloud exemplifies our commitment to deliver frictionless collaboration across an open technology ecosystem. Democratizing clean room access and delivering the privacy-centric tools that brands demand is opening up new avenues for growth for our shared customers.” – Matt Kilmartin, Co-founder and CEO, Habu
LiveRamp on Google Cloud can enable privacy-centric identity resolution within BigQuery to drive more effective data partnerships. LiveRamp’s solution in BigQuery unlocks the value of a customer’s first-party data and establishes a privacy-centric identity data model that can accurately:
Improve consolidation of both offline and online records for a more accurate and holistic view of customer audience profiles
Activate audiences securely with improved match rates for greater reach and measurement quality
Connect customer and prospect data with online media reports and partner data elements to help improve customer journeys and attribution insights using ML models
“LiveRamp has developed deep support for accurate and privacy-centric data connectivity throughout the Google ecosystem. As one of the first clean room providers on Google Cloud with our Data Collaboration Platform, we have been very excited to watch the evolution of Analytics Hub into a powerful, native clean room solution for the entire BigQuery ecosystem. We’re now helping global clients more easily collaborate in Analytics Hub with an ‘enterprise ID,’ an optimal key for connecting data and improving analytic accuracy between clean room partners. A purpose-built, first-party configurable identity graph — powered by LiveRamp’s proprietary third-party graph — drives greater connectivity, more flexible analytics for both individual and household levels of customer modeling, and ensures that sensitive customer data processing on BigQuery is performed safely.” – Max Parris, Head of Identity Resolution Products, LiveRamp
We are partnering with TransUnion to bring their TruAudience transfer-less Identity Resolution Service to BigQuery. TransUnion customers can execute identity resolution without data leaving the client’s BigQuery environment. The new integration can enable marketers to improve their data quality, get better insights, and collaborate in BigQuery data clean rooms in a privacy-first way. BigQuery’s interoperability features enable TransUnion to integrate their existing machine-learning-based entity resolution service with few code changes.
“A robust view of identity forms the basis for effective marketing analytics. We’re excited to build on BigQuery so that clients can enhance the performance of their advanced data analytics through identity resolution, enrichment, and deduplication. Our partnership can help marketers access data and collaborate with partners without sharing sensitive customer information.” – Ryan Engle, VP of Identity Solutions, Credit Marketing, and Platform Integrations, TransUnion
Tumult Labs helps BigQuery customers implement differential privacy in clean rooms while minimizing compliance and privacy risk.
“We are thrilled to help BigQuery data clean rooms customers unlock new and valuable insights from their data without compromising on privacy. Our proven differential privacy platform provides rock-solid output controls in clean rooms, ensuring that aggregated data can be shared with rigorous guarantees of privacy protection. We look forward to building on our strategic partnership with BigQuery to provide even more customers easier access to world-class privacy technology.” – Gerome Miklau, Co-founder and CEO, Tumult Labs
Want to learn more about BigQuery data clean rooms? Sign up for the public preview today. For marketers who are looking to use a clean room to better understand Google and YouTube campaign performance and leverage their first-party data in a privacy-centric way, consider Google’s advertising measurement solution Ads Data Hub for Marketers, built on BigQuery.
Read More for the details.
Organizations that are effective at using data and AI are more profitable than their competitors and see improved performance across a variety of business metrics, according to recent research. Already, 81% of organizations have increased their data and analytics investments over the previous two years. However, many organizations are still struggling to extract the full business value of their data, with over 40% citing disparate analytics tools and data sources and poor data quality as their biggest challenges.
Google Cloud is in a unique position to offer a unified, intelligent, open, and secure data and AI cloud for organizations. Thousands of customers across industries worldwide use Dataproc, Dataflow, BigQuery, BigLake, and Vertex AI for data-to-AI workflows. Today, we are excited to announce BigQuery Studio, a unified, collaborative workspace for Google Cloud’s data analytics suite that helps accelerate data-to-AI workflows, from data ingestion and preparation to analysis, exploration, and visualization, all the way to ML training and inference. It allows data practitioners to:
Use SQL, Python, Spark or natural language directly within BigQuery and leverage those code assets easily across Vertex AI and other products for specialized workflows
Extend software development best practices such as CI/CD, version history and source control to data assets, enabling better collaboration
Uniformly enforce security policies and gain governance insights through data lineage, profiling and quality, right inside BigQuery
Disparate tools create inconsistent experiences for analytics professionals, requiring them to use multiple connectors for data ingestion, switch between coding languages, and transfer data assets between systems. This significantly impacts time-to-value of organizations’ data and AI investments.
BigQuery Studio addresses these challenges by bringing an end-to-end analytics experience to a single, purpose-built platform. It provides a unified workspace with both a SQL and a notebook interface (powered by Colab Enterprise, currently in preview), allowing data engineers, data analysts, and data scientists to perform end-to-end tasks including data ingestion, pipeline creation, and predictive analytics, all in the coding language of their choice.
For example, analytics users like data scientists can now use Python in a familiar Colab notebook environment for data analysis and exploration at petabyte-scale right inside BigQuery. BigQuery Studio’s notebook environment supports browsing of datasets and schema, autocompletion of datasets and columns, and querying and transformation of data. Furthermore, the same Colab Enterprise notebook can be accessed in Vertex AI for ML workflows such as model training and customization, deployment, and MLOps.
Additionally, by leveraging BigLake with built-in support for Apache Parquet, Delta Lake and Apache Iceberg, BigQuery Studio provides a single pane of glass to work with structured, semi-structured and unstructured data of all formats across cloud environments such as Google Cloud, AWS, and Azure.
Shopify, a leading commerce platform, has been exploring how BigQuery Studio complements its existing BigQuery environment.
“Shopify has invested in employing a team with a diverse array of skill sets to remain ahead of trends for data science and engineering. In early testing with BigQuery Studio, we liked Google’s ability to connect different tools for different users within a simplified experience. We see this as an opportunity to reduce friction across our team without sacrificing scale we expect from BigQuery” – Zac Roberts, Data Engineering Manager, Shopify
BigQuery Studio improves collaboration among data practitioners by extending software development best practices such as CI/CD, version history and source control to analytics assets including SQL scripts, Python scripts, notebooks and SQL pipelines. Additionally, users will be able to securely connect with their favorite external code repositories, so that their code can never be out of sync.
In addition to enabling human collaboration, BigQuery Studio also provides an AI-powered collaborator for contextual chat and code assistance. Duet AI in BigQuery can understand the context of each user and their data, and uses it to auto-suggest functions and code blocks for SQL and Python. Through the new chat interface, data practitioners can use natural language to get personalized, real-time guidance on specific tasks, reducing the need for trial and error and for combing documentation to find a needle in a haystack.
BigQuery Studio lets organizations derive trusted insights from trusted data by helping users understand data, identify quality issues, and diagnose problems. Data practitioners can track data lineage, profile data, and enforce data-quality constraints to help ensure that data is high-quality, accurate, and reliable. Later this year, BigQuery Studio will surface personalized metadata insights like summaries of datasets or recommendations for how to derive deeper analysis.
Additionally, BigQuery Studio allows admins to uniformly enforce security policies for data assets by reducing the need to copy, move, or share data outside of BigQuery for advanced workflows. With unified credential management across BigQuery and Vertex AI, policies are enforced for fine-grained security without needing to manage additional external connections or service accounts. For example, using simple SQL in BigQuery, data analysts can now use Vertex AI foundation models for images, videos, text, and language translations for tasks like sentiment analysis and entity detection over BigQuery data, without having to share data with third-party services.
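As a hedged sketch of what this looks like in practice (the connection, dataset, and table names are illustrative, and the remote-model options follow the preview-era BigQuery ML syntax as we understand it):

```sql
-- Illustrative only: register a Vertex AI large language model as a
-- BigQuery remote model over an existing BigQuery connection...
CREATE OR REPLACE MODEL demo.text_model
  REMOTE WITH CONNECTION `us.vertex_ai_conn`
  OPTIONS (REMOTE_SERVICE_TYPE = 'CLOUD_AI_LARGE_LANGUAGE_MODEL_V1');

-- ...then run sentiment analysis over table data with plain SQL,
-- without the data leaving BigQuery.
SELECT ml_generate_text_result
FROM ML.GENERATE_TEXT(
  MODEL demo.text_model,
  (
    SELECT CONCAT('Classify the sentiment of this review: ', review_text) AS prompt
    FROM demo.product_reviews
  ),
  STRUCT(0.2 AS temperature, 128 AS max_output_tokens)
);
```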
“Our data & analytics team is ceaselessly committed to staying ahead of the curve of data engineering and data science. During our initial trials with BigQuery Studio, we were particularly impressed by Google’s prowess in integrating diverse tools into a singular, streamlined experience. This fusion not only diminishes friction but also significantly amplifies our team’s efficiency, a testament to the power of BigQuery.” – Vinícius dos Santos Mello, Staff Data Engineer, Hurb
“As an early adopter of BigQuery Studio, we were impressed with its ability to not only minimize friction but also ensure robust data protection and centralization. The added support for Pandas DataFrames will further streamline our processes, saving valuable time for our team to collaborate and stay ahead of the curve.” – Sr. Director Analytics Engineering, Aritzia
“Duet AI in BigQuery has helped our data team at L’Oréal accelerate our transformation by making it easier for us to explore, understand, and use our data. With Duet AI, we can quickly query our data to get the insights we need to make better decisions for our business. We are excited to continue working with Duet AI to further our transformation and achieve our business goals.” – Antoine Castex, Data Platform Architect, L’Oréal
BigQuery Studio is now available for customers in preview. Check out the documentation to learn more and sign up to get started today.
Read More for the details.
Today, we are introducing the preview of Duet AI in Looker, tapping into the full power of Google and our years-long leadership in generative AI to reinvent how you work with and present your data. It’s time to reimagine what business intelligence truly means and go beyond traditional limits that have slowed you down, kept you searching for answers, or forced you down a path of busy work. In addition to these new gen AI capabilities, we are also executing on our vision to be the most open BI tool on the market with new industry partnerships that bring our customers a broader range of ways they can interact with and benefit from their business data.
The promise of modern business intelligence is that all decision makers can work with data independently in their preferred environment, building upon shared data measures everyone can agree with, sparking smart and insightful visualizations that spawn collaboration across teams. With the announcement of Duet AI in Looker, we continue to break down the barriers that have limited BI penetration through the entire organization – and bring an intelligent analyst to everyone.
Last year, we expanded the Looker family to include Looker Studio. We have continued our rapid pace of innovation with the debut of Looker (Google Cloud core), which brings Looker to the Cloud console; the addition of administrative functionality to Looker Studio Pro; and native connectivity that brings Looker Studio and Looker closer together, designed to give business users fast access to their data and unify behind a single source of truth for metrics. The Looker vision is to offer our users the most complete BI product in the industry, and Duet AI in Looker will help us materialize this vision.
Duet AI in Looker builds on these innovations and brings direct integrations to conversational AI and large foundation models that will change the way you work with data — starting with a host of capabilities showcased in our session, What’s New In BI for Today’s GenAI World, at Next ‘23.
With Duet AI in Looker, we plan for you to be able to:
Have fast and simple conversations with your data, going beyond questions and answers to drive actions informed by your unique business.
Create entire reports or advanced visualizations with only a few sentences of instruction, greatly saving time and minimizing the need for technical expertise.
Create brand-new Google Slides presentations from your reports, complete with automatically generated summaries.
Use natural language to create formulas for calculated fields, transforming the information flowing from your data sources, and use those formulas in your visualizations.
Easily generate LookML code using natural language.
Getting insights from your business data should be as easy as asking Google. That is the mission of Looker — instant and insightful, able to alert you when it really matters and guide you to impactful decisions faster than ever, powered by the most important information — yours.
By greatly simplifying the business intelligence experience, you can bring insights to all users in your organization, provide them a speed boost to get answers faster, build reports quicker, and rapidly write code that just works. This mission can redefine what you’ve come to expect from a BI tool and set you on a path to reimagine how you can build your business – with data at the center. And this initial preview is just the beginning of how we’re integrating gen AI capabilities at the heart of Looker.
Today, we are also delivering on our promise to open up our semantic layer, the heart of Looker’s modeling capabilities, to Tableau, in preview, and to Microsoft Power BI, in general availability. Tableau users can explore their data with a familiar drag-and-drop experience while tapping into Looker’s modeling layer for standard metrics, and Power BI users can access those centrally defined metrics and data relationships from Looker’s semantic layer through Power BI Desktop. These new data connectors add to our existing front-end options, including Looker, Looker Studio, and Connected Sheets for Looker.
We are partnering with Alteryx to provide Looker Studio users with native access to the Alteryx Designer Cloud for data preparation, and enhanced cloud connectivity, starting with Microsoft Excel and CSV files from storage platforms including SharePoint and OneDrive.
We are also pleased to announce the general availability of our partner integration with Sisu Data later this quarter. With the combination of Looker and Sisu, customers can determine the root cause of changes, spot data outliers, and identify next steps for analysis. By augmenting analytical workflows with AI-powered analytics, customers can receive significant insights on cloud-scale data in seconds.
The gen AI revolution is propelling our entire world forward, and transforming how we approach all aspects of analyzing, building and sharing data. With Duet AI in Looker, we can tap into the smartest machine ever designed – you – and help accelerate your entire organization to ignite its next discoveries through simple tools.
Chat with your business data and design your future with a product built for it. Contact your Google Cloud representative to learn more.
Read More for the details.
Over the last few months, I’ve been blessed with a front-line opportunity to partner with many of the most forward-leaning technology companies using generative AI. I already felt fortunate to have joined Google Cloud last year, largely because I believe the companies that are able to harness modern analytics and AI are the ones that will flourish, and no one is better positioned to meet this need than Google. I certainly didn’t predict the wave of demand we’d see for gen AI when I joined, but I couldn’t imagine a better place to be.
Rather than going on about why I think gen AI will transform every industry and domain, we’ve rounded up some use cases and demos showcasing what we’re working on with these partners. Thanks to all the partners listed below, we’ve had the opportunity to learn and grow with top innovators, and to help customers accelerate their gen AI agendas.
Aible gives enterprises a fast and secure path to getting value from AI. Designed for business users, Aible offers a simple interface where users can start chatting and asking questions, leveraging their structured and unstructured data in their virtual private cloud. As users provide feedback or edit chat responses, Aible improves the generative AI outputs automatically. Aible double-checks the answers of the generative AI automatically to reduce hallucinations. It also offers visualization capabilities through Looker with one click. Aible is built on Google Cloud and leverages BigQuery, Vertex AI, and PaLM 2, giving businesses more than 4x faster analytics and 9x lower costs. Aible’s solution is available on Google Cloud Marketplace.
DataStax gives organizations the cloud database and streaming technologies they need to build real-time, human-like, gen AI-powered digital experiences. DataStax’s AstraDB database-as-a-service (DBaaS) is built on Apache Cassandra, a powerful and highly scalable distributed database, and leverages Google BigQuery, Dataflow, and Vertex AI for generative AI. With DataStax, companies can build sophisticated AI assistants and enable semantic caching for gen AI, large language model (LLM) chat history, and more. They can also take advantage of vector search, which is a key capability for letting databases provide long-term information retrieval for AI applications.
Dialpad’s AI-powered customer intelligence platform helps sales and service teams have more efficient and successful customer conversations. Dialpad powers voice and video meetings on any device, and its artificial intelligence, Dialpad Ai, generates insights and tips in real time. Reps receive meeting summaries, highlights of essential information, action items, and context-sensitive suggestions and prompts as the conversation is happening. These capabilities are powered end-to-end by Google Cloud — from the infrastructure that allows Dialpad to instantly deploy and scale, to automation and assistance driven by Vertex AI.
Neo4j lets enterprises cost-effectively turn their structured and unstructured data into knowledge graphs for insights and recommendations that are accurate, transparent, and explainable. Neo4j’s graph database works with the generative AI capabilities in Vertex AI to digest data and extract entities and relationships, so that users can automatically generate knowledge graphs, reducing the costs to get started with graphs. Additionally, chatbots enable business users to interact with knowledge graphs using natural language, making knowledge graph insights accessible to users beyond data scientists.
SAP and Google Cloud offer an open data solution that uses generative AI to uncover unexpected correlations in purchase orders, revealing information on emissions and carbon footprint which were previously unavailable. Companies can bring together their previously disparate SAP and non-SAP data in a comprehensive and open data cloud through SAP Datasphere and Google Cloud Cortex Framework on BigQuery. And they can use Vertex AI and Google Cloud’s generative AI capabilities to uncover previously invisible insights from that data for virtually any use case, including sustainability, HR, commerce, procurement, finance, and manufacturing improvements.
Typeface is an enterprise-grade generative AI platform built to supercharge personalized content creation. Integration with Google’s large language models allows Typeface to unite content velocity with brand personalization and control. Google customers can now create exceptional, on-brand content faster and more easily than ever before. Typeface provides industry-first self-serve solutions for complete lifecycle content development, so every employee can craft captivating content with ease, speed, and brand authenticity.
Human capital management provider UKG is leveraging Google Cloud’s foundation models such as PaLM 2, Vertex AI, and its own AI models to create and embed generative AI into its solutions. UKG gives managers and team leaders more conversational interactions, augments requests with insights, and supports leaders with better decision-making to enhance the employee experience. Teams can generate job descriptions, proactively detect bias in those descriptions, draft interview questions, and more. Additionally, employees can interact with data in a conversational way, spend more time doing the things that are more meaningful to them, and build workforces with diversity, equity, inclusion, and belonging at their core.
One exciting thread runs across all of these solutions: Gen AI is making new ideas, efficiencies, experiences, and offerings possible for organizations, their employees, and their customers. We hope these demos give you inspiration for what you can do with gen AI in your business and look forward to sharing more with you from our open and innovative AI partner ecosystem.
Read More for the details.