On the heels of a very successful #DataCloud Summit, I am pleased to let you all know that Snowflake CEO Frank Slootman is publishing a book that really illuminates the Data Cloud and how we got here.
According to Gartner, the public cloud services market continues to grow, largely driven by the data demands of modern applications and workloads. In recent years, organizations have struggled to process big data: data sets large enough to overwhelm commercially available computing systems.
For a long time, the only real solution was data warehousing services. These services relied on specialized computer hardware to increase the scale of data processing, but they had major drawbacks: extremely high cost and performance constraints. Increasing scale this way wasn’t feasible for many, or even most, companies. With demand continuing to explode, the world desperately needed a more democratic solution for big data delivery.
A new book, The Rise of the Data Cloud, looks at how that problem was solved in just a few short years. As the founders of Snowflake came together to design a better big data solution, they built an entirely new class of cloud computing in the process.
I have been an active part of many communities over my 30+ year career. I started out attending Rocky Mountain Oracle Users Group meetings in the late ’80s (yes, we had meetups way back then).
Over the years I went from an attendee asking lots of naive, newbie questions to presenting at events, then helping organize events, then joining the board of directors, and eventually becoming president of the group. I followed the same path with the Oracle Development Tools User Group (ODTUG) as well.
To say I benefited from participating in those and other database and data communities is an understatement.
I benefited not only in my career but also personally.
From a career perspective, I learned a massive amount that I could use in my job, including very practical tips and tricks for building better systems for my employers and clients. The time spent at user group meetups and conferences more than paid for itself in the value returned. It allowed me to grow in my career to higher-level jobs and better pay.
I also built a HUGE network of contacts who were experts in their fields. I was often able to email (or, these days, tweet) them questions and get immediate answers to problems I could not solve on my own, saving me time and, therefore, money.
Some of those experts were/are my mentors.
Some of those experts helped me find my next gig.
Some of those experts have now followed me to Snowflake and are helping our customers on a daily basis.
I can confidently say that my active participation in these communities led to my role as Chief Technical Evangelist at Snowflake. #LoveMyJob
I made friends all over the world! I not only gained a network of like-minded professionals who worked in the same field as I did; many of them became friends. There is a fairly large number of folks I have now been friends with for several decades (some of you are no doubt reading this now!).
That has meant that as I have traveled the world in my role here at Snowflake (and even before, as an independent consultant), I find friendly faces in nearly every city, state, or country I visit. When you are traveling a lot for work, away from your family, it is nice to be able to grab coffee, lunch, or dinner with someone who has known you for years, just to chat about life. In our ever more digitally connected world, face-to-face is still the best. 🙂
That is the power of a good community!
At Snowflake, we call our community advocates and leaders Data Heroes.
Data Heroes are recognized as Snowflake experts. They get the opportunity to meet with Snowflake product and engineering teams, receive early invitations to Snowflake events, and are welcomed to speak about their experiences with Snowflake at user group meetups, conferences, and other online events. As a Data Hero, you’ll help and educate other users by sharing knowledge, tips, and best practices, both online and in real life.
You even get your own custom Data Hero trading card and other cool Snowflake swag!
You can become one of the leaders in the new world of the Data Cloud. You might even become a Super Data Hero! See how to get started today.
Join us for Group By Data Heroes
This coming week, on July 15, 2020, we will have our very first Data Heroes learning event, which we call Group By. It is a FREE event hosted by me and our community team, with sessions from our expert Data Heroes and Super Data Heroes.
The event opens with a session from Snowflake co-founder Benoit Dageville, the original #DataHero!
This will be a LIVE, interactive event that you will not want to miss, so sign up today to reserve your spot and join the Snowflake Community.
Instead of having our annual in-person user event, Snowflake Summit, this year we are having a very exciting virtual event on June 2nd – #SayHellotoTheDataCloud. Join us live at 9 AM PDT, 11 AM CDT, 12 PM EDT, 5 PM BST, or 6 PM CEST for another Snowflake first!
You can be the first to hear from Snowflake CEO Frank Slootman, Snowflake co-founder Benoit Dageville, and Christian Kleinerman, Snowflake’s Senior VP of Product. They’ll detail how you can leverage the #DataCloud for all your data to acquire the deepest insights possible. You’ll also learn how Snowflake customers, data providers, and data service providers use Snowflake #CloudDataPlatform to seamlessly connect and collaborate with data.
What is #TheDataCloud?
#TheDataCloud is the next evolution in democratizing and governing access to data! It provides a central location for any organization to store all of its data inexpensively and as a single source of truth. It not only unites an organization’s data but also connects any two or more organizations that store their data in the Data Cloud. Enabling the Data Cloud is the Snowflake #CloudDataPlatform, the engine that drives it. With it, organizations can easily load their data into the Data Cloud, and then securely integrate, analyze, share, and monetize that data.
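To make the "load, then securely share" workflow a bit more concrete, here is a minimal sketch of the kind of SQL involved, assembled in Python. All object names (`sales`, `my_stage`, `sales_share`, `partner_account`) are hypothetical, and actually executing these statements would require a real Snowflake account and credentials:

```python
# A sketch of a basic Snowflake load-then-share workflow.
# Object names below (sales, my_stage, sales_share, partner_account)
# are illustrative, not from any real account.

def load_and_share_statements(table: str, stage: str, share: str) -> list:
    """Build the SQL statements for loading a CSV file and sharing the table."""
    return [
        # Upload a local file to an internal stage, then copy it into a table.
        "PUT file:///tmp/{t}.csv @{s}".format(t=table, s=stage),
        "COPY INTO {t} FROM @{s} FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)".format(
            t=table, s=stage
        ),
        # Create a secure share, grant read access, and add a consumer account.
        "CREATE SHARE {sh}".format(sh=share),
        "GRANT SELECT ON TABLE {t} TO SHARE {sh}".format(t=table, sh=share),
        "ALTER SHARE {sh} ADD ACCOUNTS = partner_account".format(sh=share),
    ]

# With credentials, these would be run through Snowflake's official
# Python connector, roughly like this:
# import snowflake.connector
# conn = snowflake.connector.connect(user=..., password=..., account=...)
# for stmt in load_and_share_statements("sales", "my_stage", "sales_share"):
#     conn.cursor().execute(stmt)
```

The notable design point is that sharing grants live, read-only access to the same underlying data rather than copying it to the consumer.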
Join Frank, Benoit, Christian, and other Snowflake leaders to:
Learn how Snowflake #CloudDataPlatform enables #TheDataCloud and how you can seamlessly store, unify, and analyze all your data.
Join the thousands of organizations that comprise #TheDataCloud to securely share and monetize your data globally, and acquire shared data and data services.
Hear how the latest innovations with Snowflake #CloudDataPlatform improve performance, governance, and analyst productivity.
Discover how to build fast and extensible pipelines for #DataScience, #DataWarehousing, #DataEngineering, and #DataApplications from a single platform.
Stay tuned, because right after the presentations you can be part of the live Q&A session with Benoit and Christian. Ask how your organization will benefit from #TheDataCloud and from the latest features available in Snowflake #CloudDataPlatform.
Reserve your spot today – sign up here and block your calendar!
#TheDataWarrior and Chief Technical Evangelist, Snowflake
A few months back I had the privilege of being interviewed by Tobias Macey on his Data Engineering Podcast. This came about because Tobias actually tweeted at me about wanting to do the interview! In this episode we spent an hour discussing the ins and outs of the Snowflake Cloud Data Platform. You can find it here. Hope you enjoy it!
How did you get involved in the area of data management?
Can you start by explaining what Snowflake is for anyone who isn’t familiar with it?
How does it compare to the other available platforms for data warehousing?
How does it differ from traditional data warehouses?
How does the performance and flexibility affect the data modeling requirements?
Snowflake is one of the data stores that is enabling the shift from an ETL to an ELT workflow. What are the features that allow for that approach and what are some of the challenges that it introduces?
Can you describe how the platform is architected and some of the ways that it has evolved as it has grown in popularity?
What are some of the current limitations that you are struggling with?
For someone getting started with Snowflake what is involved with loading data into the platform?
What is their workflow for allocating and scaling compute capacity and running analyses?
One of the interesting features enabled by your architecture is data sharing. What are some of the most interesting or unexpected uses of that capability that you have seen?
What are some other features or use cases for Snowflake that are not as well known or publicized which you think users should know about?
When is Snowflake the wrong choice?
What are some of the plans for the future of Snowflake?
This is a great podcast series, so you might want to add it to your regular list!
The Data Warrior & Chief Technical Evangelist at Snowflake