Azure Cosmos DB Conf

Live Stream Schedule

Azure Cosmos DB Conf - Keynote

Join Kirill Gavrylyuk, Director of Product Management for Azure Cosmos DB, to kick off Azure Cosmos DB Conf.

Kirill Gavrylyuk

Online and Real-Time Database Migration to Azure Cosmos DB with Striim

There is significant demand for data movement and continuous data integration as workloads shift to the cloud. Modernizing databases by offloading workloads to the cloud requires building real-time data pipelines from legacy systems. The Striim® platform is an enterprise-grade cloud data integration solution that continuously ingests, processes, and delivers high volumes of streaming data from diverse sources, on-premises or in the cloud. Cloud architects, data architects, and data engineers can use Striim to move data into Azure Cosmos DB in a consumable form, quickly and with sub-second latency, to easily run critical transactional and analytical workloads in Microsoft Azure. We will provide a live demo showing how Striim is used to migrate enterprise databases to Azure continuously and in real time with no downtime. In this session you will learn how to:
- Prepare Azure Cosmos DB for data integration with automatic creation of target schemas and tables that reflect the source database
- Set up in-flight transformations right in the GUI to minimize end-to-end latency and enable real-time analytics and operational reporting
- Deploy zero-downtime migrations to Azure Cosmos DB from existing enterprise databases anywhere
- Increase IT productivity and reduce cost of ownership by integrating and enriching data from multiple sources

Andrew Liu

Alok Pareek

Data and Different System Architectures to Build a Truly Unified View

This session examines the digital components enterprises need to develop a 360-degree view of their customers, and the capabilities and limitations inherent in the technologies available to help them do that. Sandeep Nawathe will cover a new approach, one that unifies disparate data sources and system architectures to develop a Unified Profile that is actionable in real time.

Sandeep Nawathe

Bridge to Azure: Streaming data into Cosmos DB with Confluent

Confluent enables large-scale big data pipelines that automate real-time data movement across any systems, applications, and architectures, empowering organizations to aggregate, transform, and move data from on-premises legacy services, private clouds, or public clouds into Azure. Confluent and Microsoft’s Commercial Software Engineering group have worked together to build a self-managed connector: the Azure Cosmos DB Connector provides a new integration capability between Azure Cosmos DB and Confluent Platform, and Microsoft provides enterprise support for it. Want to know more about your options for Bridge to Cloud with Confluent? During this session, you will learn more about:
- Getting started with Confluent on Azure
- Design patterns for Bridge to Cloud with Confluent
- Quickly installing the Azure Blob Storage Connector without having to spin up additional infrastructure
- Tips and tricks for installing and configuring the Cosmos DB Self Managed Connector

Alicia Moniz

Data Modeling and Partitioning in Azure Cosmos DB

For many newcomers to Cosmos DB, the learning process starts with data modeling and partitioning. How should you structure your model? When should you combine multiple entity types in a single container? Should you denormalize your entities? What’s the best partition key for your data? In this session, we discuss the key strategies for modeling and partitioning data effectively in Cosmos DB. Using a real-world NoSQL example based on the AdventureWorks relational database, we explore key Cosmos DB concepts (request units (RUs), partitioning, and data modeling) and how understanding them guides the path to a data model that yields the best performance and scalability. Attend this session and acquire the critical skills you’ll need to design the optimal database for Cosmos DB.
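The central tradeoffs the session covers, embedding versus referencing and combining entity types in one container, can be sketched with plain JSON-style documents. This is a hypothetical illustration in the spirit of the AdventureWorks example; the entity and field names are assumptions, not code from the session:

```python
# Embedded model: one document per customer, orders denormalized inside.
# Works well when orders are few and are always read together with the customer.
customer_embedded = {
    "id": "c1",
    "type": "customer",
    "customerId": "c1",  # partition key
    "name": "Jane Doe",
    "orders": [
        {"orderId": "o1", "total": 59.99},
        {"orderId": "o2", "total": 12.50},
    ],
}

# Referenced model: separate documents that share the same partition key, so a
# single-partition query can still fetch a customer together with its orders.
customer_doc = {"id": "c1", "type": "customer", "customerId": "c1", "name": "Jane Doe"}
order_docs = [
    {"id": "o1", "type": "salesOrder", "customerId": "c1", "total": 59.99},
    {"id": "o2", "type": "salesOrder", "customerId": "c1", "total": 12.50},
]

# Mixing entity types in one container works because each document carries a
# discriminator ("type") and they share the partition key ("customerId").
container = [customer_doc, *order_docs]
same_partition = {d["customerId"] for d in container}
assert same_partition == {"c1"}
```

The choice between the two shapes is usually driven by how the data is read: entities read together belong together.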

Leonard Lobel

Developing HTAP Analytical Solutions with Azure Cosmos DB and Azure Synapse Analytics

Using Azure Synapse Link for Azure Cosmos DB, we can enable cloud-native hybrid transactional and analytical processing on our Cosmos DB data. This enables data engineers, analysts and data scientists to perform advanced analytics on data living in Cosmos DB without affecting the performance of transactional workloads inside Cosmos DB. In this session, I'll show you how you can use Azure Synapse Link for Cosmos DB to get near real-time insights into operational data with no ETL required and no impact on operational workloads. By the end of this session, you will know what Synapse Link for Cosmos DB is and how it works with the Cosmos DB Analytical Store, how to enable Synapse Link for your Cosmos DB containers, how you can analyze your data using Synapse Apache Spark or SQL Serverless and in what situations to use Synapse Link for Cosmos DB.

Will Velida

Cosmos DB for the SQL Professional

Knowing data is crucial when it comes to design, and SQL Professionals have this skill refined. But knowing implementation detail is also important. While there are similarities between these two technologies, there are also some large differences, some of which may surprise the SQL Professional. Sometimes an implementation oversight can bite you hard at the wrong time, and at a time when correcting that oversight might be costly and problematic. Come along to this session and find out why a Primary Key is not what you think, why tables are not equal to containers, and a few other details that might surprise you. This short and sharp session should set you on your way to becoming as good at Cosmos as you are at SQL.

Martin Catherall

GraphQL over CosmosDB: where to start

The popularity of GraphQL can't be denied, and it fits really nicely with a schemaless data model like the one we can achieve with CosmosDB. But how do we get started? What options do we have when it comes to running GraphQL in Azure? How can we get rich type safety between our GraphQL schema and CosmosDB models? In this session we'll go from zero to hero and get you running your first GraphQL server underpinned by CosmosDB, ready for your next application.

Aaron Powell

Real-Time Scoring to Improve Personalization with Adobe Experience Platform

For an insurance company, we generated real-time sales predictions for anonymous, unknown website visitors using Adobe Experience Platform. We discuss how we used Adobe Experience Platform and Data Science Workspace services to provide brands with the ability to use ML models to score customer behavior in real time, personalizing and contextualizing experiences in order to increase conversions.

Kalaiyarasan Venkatesan

Learnings from building high scale low latency applications

The session will go through things to keep in mind while building systems with low-latency, high-throughput requirements (less than 10 ms latency and throughput of hundreds of thousands to a million QPS) that use distributed data stores (such as Cosmos DB and Aerospike), based on experiences at InMobi.

Supreeth Selvaraj

Design and implementation of Cosmos DB Change Feed-centric architecture

This session will present actual development cases of design based on Lambda architecture using Cosmos DB Change Feed and implementation patterns using Azure Serverless technology. This session will be presented by two Microsoft MVPs (Azure) based on their experience. (Delivered in Japanese).

Kazuyuki Miyake

Tatsuro Shibamura

Modern infrastructure for modern apps with Cosmos DB, AKS and Pulumi

Building scalable applications begins with adopting scalable architecture patterns in Azure. Pulumi lets you define and deploy resources in Azure using .NET, Node.js, Python, and Go. The next-generation Azure provider gives you access to 100% of Azure with same-day support for all new features. Join Azure MVP Mikhail Shilkov as he shows you how to build a foundation for your modern apps with CosmosDB, AKS, and Pulumi.

Mikhail Shilkov

Cosmos DB + Azure Functions: Serverless database processing

With the native integration between Cosmos DB and Azure Functions, you can create database triggers, input bindings, and output bindings directly from your account. Using Azure Functions and Cosmos DB, you can create and deploy event-driven serverless apps with low-latency access to rich data for a global user base. In this session, we'll explore what it takes to set up a serverless environment capable of performing CRUD operations on a Cosmos DB account, as well as some recommendations and use cases.
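The event-driven flow the session describes, where a write to a container causes a function to run, can be sketched with a toy in-memory dispatcher. This only mimics the pattern; it is not the Azure Functions SDK, and all names here are made up:

```python
# Registered handlers stand in for Azure Functions bound to the change feed.
handlers = []
processed = []

def on_change(fn):
    # Stand-in for declaring a Cosmos DB trigger binding on a function.
    handlers.append(fn)
    return fn

def create_item(container, doc):
    # Stand-in for a write to a Cosmos DB container: the platform persists the
    # document and then dispatches it to every registered trigger.
    container.append(doc)
    for fn in handlers:
        fn(doc)

@on_change
def handle(doc):
    processed.append(doc["id"])

orders = []
create_item(orders, {"id": "o1", "status": "created"})
create_item(orders, {"id": "o2", "status": "created"})
assert processed == ["o1", "o2"]
```

The key point is that the application code never polls: the platform invokes the handler as changes arrive.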

Luis Beltran

Data modeling and schema design for Cosmos DB

Facilitate migration from an RDBMS to Cosmos DB through denormalization, leveraging the scalability, flexibility, and easy schema evolution of Cosmos DB. In this session, we will demonstrate how data modeling helps harness the power of Cosmos DB, resulting in reduced development time, increased application quality, and lower execution risks across the enterprise. The Hackolade data modeling tool also supports forward- and reverse-engineering use cases with the Core SQL API, the MongoDB API, and the Gremlin API.

Pascal Desmarets

Integrating Azure Cosmos DB in your Cloud Solution

When there is a NoSQL requirement in your cloud solution on the Azure platform, there is a managed service available in Cosmos DB. Azure Cosmos DB is Microsoft’s offering for a distributed NoSQL service in the cloud. As a Cloud Architect, I have designed solutions leveraging Azure Cosmos DB, and I’d like to share my experience in this session. You can create cloud solutions with Azure Cosmos DB by integrating it with other services such as Azure Search, Functions, and Logic Apps. In the session, I will show the integration, combined with real-world use cases and demos.

Steef-Jan Wiggers

Vercel + CosmosDB: The Journey

In this session I'd like to go through the most important challenges we had to solve at Vercel using CosmosDB. Over an almost four-year journey we faced a variety of challenges that could be of interest to attendees, such as:
- Tracing operations
- Custom metrics reporting
- Repartition migrations
- Custom indexing across partitioned collections
- Many-to-many relationships
- Performance optimizations
- Helper functions
I'll mostly focus on the features we built that worked well for us, and share the stunning throughput we have had with CosmosDB.

Javi Velasco

Operational Triumph with Cosmos DB

Since its official release in 2017, ASOS has been using Cosmos DB to reimagine our Commerce platform. Let’s take a look at how Cosmos DB has been vital in conquering our operational data needs while remaining flexible for peak periods such as Black Friday. By combining Cosmos DB and Event Hubs we have developed robust, hassle-free big data ingestion pipelines. We'll finish by showing how we have adopted Data Mesh principles and developed a distributed lambda architecture.

Gary Strange

Amrish Patel

On Demand Sessions

Blowing your mind with F#

F# is the lesser known of the .NET programming languages, but it can pack a punch when it comes to building applications. But what does that have to do with CosmosDB? What if I told you that you could get compile-time checking of the validity of your database access, the collections to query, and the queries themselves? I'd say that you'll have to join me and find out!

Aaron Powell

Partitioning Tips for Cosmos DB to Increase Performance and Save Money

Choosing the right partition key in Cosmos DB is complicated, but the right key will help you both increase the performance of your application and reduce the number of RUs you need to provision. There are several things to consider when picking a good key, including how evenly it spreads both the physical layout of your data and the volume of requests across all partitions. In this session we will walk through a list of questions to ask yourself when picking a good key, and we will explore a few real-world customer scenarios.
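The "how evenly does it spread requests" question can be checked with back-of-the-envelope arithmetic over a request sample. A minimal sketch comparing two hypothetical candidate keys (`userId` and `country` are assumed field names, not from the session):

```python
from collections import Counter

# A small sample of requests; in practice this would come from logs or metrics.
requests = [
    {"userId": "u1", "country": "US"},
    {"userId": "u2", "country": "US"},
    {"userId": "u3", "country": "US"},
    {"userId": "u4", "country": "NZ"},
]

def skew(sample, key):
    # Ratio of the hottest partition's request count to a perfectly even share:
    # 1.0 means perfectly balanced; larger values mean hotter hot partitions.
    counts = Counter(r[key] for r in sample)
    even_share = len(sample) / len(counts)
    return max(counts.values()) / even_share

assert skew(requests, "userId") == 1.0   # each value gets one request: balanced
assert skew(requests, "country") == 1.5  # "US" is a hot partition
```

A skewed key concentrates RU consumption on one physical partition, so you pay for throughput the cold partitions never use.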

Justine Cocchi

Real World NoSQL design patterns with Azure Cosmos DB

In this session we will go over several real-world NoSQL design and solution patterns to illustrate how CosmosDB is used to help customers solve critical business challenges, and focus on best practices for building distributed, scalable, and resilient systems.

Sergiy Smyrnov

A deep-dive into the Cosmos DB repository-pattern .NET SDK

A code-heavy presentation introducing the Cosmos DB repository-pattern .NET SDK from the author of the library. In this presentation, David Pine of Microsoft will demonstrate the inner workings of the repository-pattern .NET SDK that wraps the official Azure Cosmos DB .NET SDK, simplifying consumption of the service offering. The SDK boasts:
- Configuration to optimize bandwidth
- Optional container per item type
- Bulk execution
- Async APIs
- Advanced partitioning strategy
- A simple & elegant IRepository interface
https://github.com/IEvangelist/azure-cosmos-dotnet-repository
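The repository pattern the SDK is built around can be sketched language-agnostically: callers program against a small interface and never touch the underlying database client directly. The actual library is .NET and its API differs; this Python sketch only illustrates the pattern:

```python
from abc import ABC, abstractmethod

class Repository(ABC):
    # The small surface callers depend on, in the spirit of IRepository.
    @abstractmethod
    def create(self, item): ...
    @abstractmethod
    def get(self, item_id): ...
    @abstractmethod
    def delete(self, item_id): ...

class InMemoryRepository(Repository):
    # Stand-in for the Cosmos-backed implementation the real SDK provides.
    def __init__(self):
        self._items = {}
    def create(self, item):
        self._items[item["id"]] = item
        return item
    def get(self, item_id):
        return self._items.get(item_id)
    def delete(self, item_id):
        self._items.pop(item_id, None)

repo = InMemoryRepository()
repo.create({"id": "a", "name": "widget"})
assert repo.get("a")["name"] == "widget"
repo.delete("a")
assert repo.get("a") is None
```

Because application code depends only on the interface, the storage backend can be swapped (or faked in tests) without touching business logic.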

David Pine

From the trenches: Building an awesome multitenant SaaS with Cosmos DB and Azure

A real-world case study of how Whally, a multitenant SaaS startup, built a modern platform from scratch on Cosmos DB and Azure. Noah shows the design and implementation decisions he made related to partitioning, data modeling, secure multitenancy, performance, real-time streaming from change feed to SignalR and more, all using ASP.NET Core on Azure App Services. This session aims to go beyond the docs to show a real example of how Cosmos DB is powering apps in the wild.

Noah Stahl

Explore Graph Analytical use-case with Azure Cosmos Gremlin API and Azure Synapse Spark graphframes

In this session you will be introduced to the Azure Cosmos Gremlin API, Azure Synapse Link, and Spark, and learn Cosmos Gremlin API best practices as well as how to use a combination of these technologies to explore your graph data for common graph analytical use cases using the Spark GraphFrames library.

Sergiy Smyrnov

Integrating Cosmos DB with Azure Functions

A session that explains how to use the input and output bindings and Cosmos DB Triggers within an Azure Function Application using C#.

Gabriela Martínez

How does Azure Cosmos DB work under the hood?

To master any technology, you need to understand the foundation of how it works on the back end first. In this session we will explore Azure Cosmos DB's infrastructure. I will explain how Azure Cosmos DB works and handles data on the back end. We will cover the basics and some of the most misunderstood features of Azure Cosmos DB. Learning new technologies makes you a better leader and collaborator. Join me to learn more about this cloud database technology.

Hasan Savran

Implementing an Event Sourcing strategy on Azure

In recent years the Event Sourcing pattern has become increasingly popular. By storing a history of events it enables us to decouple the storage of data from the implementation of the logic around it. And we can rebuild the state of our data to any point in time, giving us a wide range of opportunities around auditing and compensation. In this demo-heavy session you will learn how we can use Azure Event Hubs to process and store these events to build our own event store based on Cosmos DB. Moreover, we will also dive into options around connecting to other Azure services and even Kafka applications to easily implement this popular pattern in our own solutions.
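The core mechanics described above, appending immutable events and rebuilding state at any point in time, fit in a few lines. A minimal sketch with made-up event shapes (not code from the session):

```python
# The event store: an append-only log of immutable events.
events = []

def append(stream_id, event_type, data, ts):
    events.append({"stream": stream_id, "type": event_type, "data": data, "ts": ts})

def rebuild(stream_id, as_of):
    # Replay history up to `as_of` to reconstruct the state at that moment,
    # which is what enables auditing and compensation scenarios.
    balance = 0
    for e in events:
        if e["stream"] == stream_id and e["ts"] <= as_of:
            balance += e["data"] if e["type"] == "Deposited" else -e["data"]
    return balance

append("acct-1", "Deposited", 100, ts=1)
append("acct-1", "Withdrawn", 30, ts=2)
append("acct-1", "Deposited", 5, ts=3)

assert rebuild("acct-1", as_of=2) == 70  # state as it was at time 2
assert rebuild("acct-1", as_of=3) == 75  # current state
```

In the architecture the session builds, Event Hubs carries the events and Cosmos DB plays the role of the append-only store.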

Eldert Grootenboer

Olena Borzenko

Optimizing Request Unit Consumption

Cosmos DB is a "Database as a Service" product: we don't control the hardware configuration, but instead use "request units" to specify the resources of our database. Nonetheless, this doesn't mean that there is nothing to do on the performance side; there are specific ways to get more out of our request units and deliver our data even faster. From consistency and indexing to connectivity settings and global replication, we will cover everything that can have an impact on your throughput and latency. In this demo-heavy session we will dig into the request unit concept and look at how all these different settings impact consumption and cause an increase or decrease in query and data-modification throughput.
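The budgeting arithmetic behind request units is simple: provisioned RU/s must cover the operation mix. As a rough, hedged illustration (a 1 KB point read costs about 1 RU; the write cost here is an assumed round number, and real costs vary with document size and indexing policy):

```python
def required_rus(reads_per_sec, writes_per_sec, read_cost=1, write_cost=5):
    # Total RU/s needed to sustain a steady mix of reads and writes.
    return reads_per_sec * read_cost + writes_per_sec * write_cost

# 1,000 reads/s and 100 writes/s at the assumed per-operation costs:
assert required_rus(1000, 100) == 1500

# Cutting per-write cost in half (e.g. with a leaner indexing policy)
# translates directly into provisioned throughput you no longer need:
assert required_rus(1000, 100, write_cost=2.5) == 1250
```

This is why the settings the session covers (indexing, consistency, and so on) show up directly on the bill: they change the per-operation RU cost.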

Warner Chaves

Cosmos DB for everyone using Graphistry's no-code and low-code GPU visual graph analytics

From security and fraud to sales and marketing, teams are adopting graph analytics to understand the relationships across their data. CosmosDB solves how to execute Gremlin graph queries, which is great for software. However, for people across an organization to successfully harness graph reasoning by exploring and interacting with their data, they also need tools. We share how to bridge those gaps by combining CosmosDB with Graphistry. Whether it's a business analyst needing a customer 360, an investigator scoping an incident, a data scientist iterating on a model, or a developer building solutions, we'll share examples of how to quickly explore CosmosDB data and create point-and-click graph solutions. Importantly, we show how to match the interaction style to each user's preference for coding versus point-and-click. Our talk focuses on three emerging capabilities in visual graph computing for CosmosDB analysts: no-code visual graph analysis for everyone, low-code tooling specialized for analysts versus researchers versus engineers, and scaling visual insights with GPUs. We'll use examples from our enterprise and federal uses, such as identity data, logs, ML, and social media.

Leo Meyerovich

Access Azure Cosmos DB with Entity Framework Core

EF Core is the data access API of choice for .NET developers. Although EF Core is considered by most to be an object-relational mapper (ORM), the team recently added support for the Azure Cosmos DB SQL API. Learn how EF Core simplifies building Cosmos DB apps with hands-on demos, explore the pros and cons of using EF Core, and receive the tools and tips that will help you decide whether EF Core is the right solution for your needs.

Jeremy Likness

Architecting Cloud Native Apps with AKS and Cosmos DB

In this session we will look at modern app development and how using Kubernetes (in this case AKS) presents a number of advantages; coupling this with a native Azure service like Cosmos DB can be a winning combination. We will examine running databases such as MongoDB in containers, and we will also discuss how Kubernetes and Cosmos DB can complement each other to provide an enterprise-grade solution.

Ben Griffin

Visualizing Cosmos DB Graph Data Over Time and Space

When building an application that takes advantage of graph data, selecting the data source and getting the data model right is often seen as the entire solution, but it is frequently only the first step. You then need to figure out how to present that data back to end users in a way that actually helps them make decisions that solve problems. In this session, Corey will discuss solutions and designs for traditional node-link diagrams using libraries from Cosmos partner Cambridge Intelligence, and will also show strategies for communicating temporal patterns in graphs using Cambridge Intelligence's new product, KronoGraph.

Corey Lanum

Migrating to Cosmos DB's API for MongoDB

Azure Cosmos DB's API for MongoDB is Microsoft’s implementation of the MongoDB wire protocol built on top of the foundational Azure Cosmos DB engine. This session focuses on migration to Cosmos DB's API for MongoDB from on-premises or cloud instances of MongoDB, using native MongoDB utilities such as mongodump as well as the Database Migration Service. We will talk about the advantages, prerequisites, and considerations for both methods while walking through a demo of the migration of a MEAN stack application.

Shweta Nayak

Experiences from 7 years of Cosmos DB

Cosmos DB is a powerful tool. It may look simple on the cover, since "it just works", but the performance and cost optimizations are rarely simple. This presentation looks at Cosmos DB from a few different viewpoints. We look at experiences gained from real-world cases and stories, from mistakes made, and from successful optimizations. At the end, we draw a couple of best practices from these experiences. This presentation is aimed at developers and data engineers, and can at times be at an advanced level, although for the most part the talk is aimed at an intermediate audience.

Sakari Nahi

Real Life migration from PostgreSQL to Cosmos DB

Follow us in the migration process of an existing customer application working on PostgreSQL to Cosmos DB. We will show you step by step the different iterations of the project, from the initial discovery of the relational data model to the optimization of the application to work efficiently with Cosmos DB.

Julien Michel

Rumen Krastev

Search and Explore your Cosmos DB data with Azure Cognitive Search

Organizations want to make full use of their data, and often this means being able to gain more insights, explore, and search this data. In this session, we will explain how Azure Cognitive Search, an AI-enabled PaaS search service, can be integrated with Cosmos DB to not only extract critical knowledge from this data, but also enable users to search, find answers to questions, and extract deep insights found within this data.

Liam Cavanagh

Ready to Get Started?

Explore Azure Cosmos DB and see it in action.