The Data Warrior


Join the Cloud Analytics Academy

Maybe not as cool as Starfleet Academy, but this is pretty cool.

Snowflake and a number of our partners have come together to create the first self-paced, vendor-agnostic, online training academy for analytics in the cloud. This academy will get you up to speed on what is happening today in the cloud with respect to data warehousing and analytics so that you can be a leader in your organization.


Reach for the summit above the clouds!

And the best part – it is FREE!

The Cloud Analytics Academy is a training and certification program for data professionals who want to advance their skills for the technology and business demands of today’s data analytics. It’s a collective industry effort from executives at Snowflake Computing, AWS, Looker, Talend and WhereScape.

We decided to launch this academy in part to address the data and analytics skills gap, which Gartner says is “threatening to overwhelm the credibility of data use” in business. At the same time, Gartner also recently projected that the worldwide software-as-a-service (SaaS) market will grow more than 63 percent by 2020.

The Academy is designed for data professionals of all technical and business levels and backgrounds. You can complete any or all of the following Academy tracks:

  • Executive Fast Track: Learn the key technologies and techniques to foster an effective cloud analytics team.
  • Cloud Foundation Track: Become proficient with the fundamental building blocks of cloud analytics.
  • Modern Data Analytics Track: Learn advanced technical concepts to propel your cloud analytics.

There will be quiz questions at the end of each session to test your retention. By completing the sessions and the quizzes in a track you will get a related academy badge. Anyone who completes all three tracks will be certified as a Cloud Analytics Academy Master.

The Academy launches on November 14th with a live keynote from Tom Davenport. You can get all the details and sign up here. (Don’t worry if you miss it as it will be recorded and available the rest of the year).

So what are you waiting for? Sign up today and take your career to the next level – above the clouds!

To infinity and beyond!

Kent

The Data Warrior


TPC-DS at 100TB and 10TB Scale Now Available in Snowflake’s Samples

Here is another great announcement we made at Snowflake that I missed sending out to everyone. It is really cool to have a full set of data to test on your own. If you want to give it a try, sign up for a FREE Snowflake account.

Get your TPC-DS data here

We are happy to announce that a full 100 TB version of TPC-DS data, along with samples of all the benchmark’s 99 queries, is now available to all Snowflake customers for exploration and testing. We also provide a 10 TB version if you are interested in smaller-scale testing.

The STORE_SALES sub-schema from the TPC-DS Benchmark (Source: TPC Benchmark™ DS Specification)
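If you want a quick feel for the data before tackling the 99 benchmark queries, here is a minimal sketch of the kind of query you can run. It assumes the samples show up in your account as a shared SNOWFLAKE_SAMPLE_DATA database with TPCDS_SF10TCL (10 TB) and TPCDS_SF100TCL (100 TB) schemas; check the exact database and schema names in your own account.

USE SCHEMA snowflake_sample_data.tpcds_sf10tcl; -- 10TB scale; use tpcds_sf100tcl for 100TB

-- Simple sanity check: sales rows and net paid by year from the STORE_SALES fact table
SELECT d.d_year,
       COUNT(*) AS sales_rows,
       SUM(ss.ss_net_paid) AS total_net_paid
FROM store_sales ss
JOIN date_dim d ON ss.ss_sold_date_sk = d.d_date_sk
GROUP BY d.d_year
ORDER BY d.d_year;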

For all the details, continue reading here: TPC-DS at 100TB and 10TB Scale Now Available in Snowflake’s Samples

Enjoy!

Kent

New Snowflake features released in Q2’17 

I have been busy lately preparing and delivering quite a few talks, so I got a bit behind on my blogging and reporting. In an effort to catch up a bit, here are some details on developments at Snowflake:

Q2 2017 Features

It has been an incredible few months at Snowflake. Along with the introduction of self-service and numerous other features added in the last quarter, we have witnessed:

  • Our customer base has grown exponentially with large numbers of applications in full production.
  • Billions of analytical jobs successfully executed this year alone, with petabytes of data stored in Snowflake today, and without a single failed deployment to-date.
  • A strong interest in pushing the boundaries for data warehousing even further by allowing everyone in organizations to share, access and analyze data.

Continuing to engage closely with our customers during this rapid growth period, we rolled out key new product capabilities throughout the second quarter.

Get the rest of the details here: New Snowflake features released in Q2’17

Cheers

Kent

The Snowflake Data Sharehouse. Wow!

Data Sharing for All Your Data

They say the Internet changed everything…

Then Big Data changed everything…

Then the Cloud changed everything…

Well my friends, Snowflake‘s announcement of its new data sharing feature has changed the game again! Your data warehouse in the cloud can now be a data sharehouse.

Building on all these technology evolutions, Snowflake has taken what we can now do with big data in a cloud-native data warehouse to a whole new level by introducing what I like to think of as Data Sharing as a Service (DSaaS).

This may be my new #1 favorite feature of Snowflake.

What is Snowflake Data Sharing?

Snowflake Data Sharing is a new feature that lets you easily, seamlessly, and securely share tables, views, and even entire databases with anyone inside the Snowflake ecosystem, in a read-only mode. They can then query the data from within their own Snowflake account and even join it to their own internal data as if it were all in their own database.

Snowflake Data Sharing architecture

That means no more needing to reformat and export data to flat files so they can be transmitted (via secure FTP or some other transfer protocol) and then loaded into your customer’s or partner’s database.

All that time and effort – gone!

Data extraction process – gone!

Data movement – gone!

Data latency – gone!

Extra storage – gone!

You create your database, load the data, then share the data. And once the data object is shared, as you add more data or update the data set, those changes are immediately available for the data consumers to query. No more wasted time waiting for an incremental update file to be built and transmitted.
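As a quick illustration (using the same hypothetical SALES.EAST.NEW_ORDERS table as the code examples below, with made-up column names), the provider just keeps loading the shared table and the consumer sees the new rows right away:

-- Data Provider: load new rows into a table that is already shared (hypothetical columns)
INSERT INTO sales.east.new_orders (order_id, order_date, amount)
VALUES (1001, '2017-07-01', 250.00);

-- Data Consumer: the new row is queryable immediately, with no export,
-- file transfer, or re-share step required
SELECT * FROM External_SalesData.east.new_orders WHERE order_id = 1001;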

And you have complete control over who sees what data. In fact, you can revoke anyone’s access instantly with a single command.

Oh – did I mention that the new feature is FREE to all Snowflake customers? It is built into the standard edition! (That’s just crazy!)

How does it work?

The reason only Snowflake can do this is its unique multi-cluster, shared data architecture, which completely separates compute resources from storage. That is why the data can be stored once (by the data provider) and then shared with an unlimited number of data consumers. The global metadata and security services in Snowflake’s cloud services layer are the key components that make sharing not only fast but secure. With independent compute clusters (i.e., virtual warehouses), data consumers can use whatever amount of compute they require to query and use the shared data without impacting either the data provider or other data consumers.

So the basic process for data sharing is simple:

  1. Data Provider creates a share container with the objects (databases, schemas, tables, or views) to be shared.
  2. Data Provider then grants a Data Consumer account access to the share.
  3. Data Consumer creates new database that maps to the shared object(s).
  4. Data Consumer then grants access privileges to a role in their account.
  5. Data Consumer starts querying, using the privileged role and their virtual warehouse.

Snowflake Data Sharing setup

Code examples:

Data Provider code:

Here is a scenario where the data provider wants to share just a single table in a database with several accounts. This approach allows the provider to verify the configuration and contents of the share before making it visible to other accounts, and it is the recommended approach.

CREATE SHARE sales_s1; -- create an empty share

GRANT USAGE on DATABASE sales to SHARE sales_s1; -- add database

GRANT USAGE on SCHEMA sales.east to SHARE sales_s1; -- add schema

GRANT SELECT on TABLE sales.east.new_orders 
             to SHARE sales_s1; -- add table

SHOW SHARES;

ALTER SHARE sales_s1 ADD ACCOUNTS=a1, a2, a3; -- add accounts

Data Consumer code:

On the consumer side, each account would create a database from the share sales_s1, then grant access to the new database in order to access the table NEW_ORDERS.

CREATE DATABASE External_SalesData from SHARE ProviderAcct1.sales_s1;

GRANT IMPORTED PRIVILEGES on DATABASE External_SalesData to MyRole;
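From there, the shared data behaves like any other database in the consumer’s account. Here is a minimal sketch (the warehouse name, local table, and join keys are hypothetical) of querying the shared table with the consumer’s own virtual warehouse and joining it to local data, as described above:

USE ROLE MyRole;
USE WAREHOUSE consumer_wh; -- any warehouse the consumer owns (hypothetical name)

-- Query the shared table directly...
SELECT COUNT(*) FROM External_SalesData.east.new_orders;

-- ...or join it to the consumer's own local data (hypothetical table and keys)
SELECT c.customer_name, o.order_id, o.amount
FROM External_SalesData.east.new_orders o
JOIN my_db.public.customers c ON c.customer_id = o.customer_id;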

Security – Revoking a Share

If for some reason a Data Provider needs to stop sharing their data, either with a single account or with everyone, that is also easy to do. They can either REVOKE the privileges granted or completely DROP the share.

REVOKE SELECT ON TABLE sales.east.new_orders
  FROM SHARE sales_s1;

or just

DROP SHARE sales_s1;
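And to cut off just one account while leaving the share in place for everyone else, the provider can edit the share’s account list. A minimal sketch, assuming the ALTER SHARE ... ACCOUNTS syntax shown earlier also supports removal:

ALTER SHARE sales_s1 REMOVE ACCOUNTS=a2; -- account a2 loses access; a1 and a3 keep it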

Unlimited Possibilities for the New Data Economy

So, how can your business change and grow with this capability (that costs you nothing)? Do you have partners that have wanted access to your data but found it too difficult to engineer that data pipeline? Is there a market for your data, and the insights it provides, that you have not even explored?

This feature redefines the old Data Warehouse into a modern Data Sharehouse that lets you derive even more value from all your data – with no limits.

With Snowflake Data Sharing, you can now transform your data into a valuable, strategic business asset.

For More Information

For more details on Snowflake Data Sharing, check out these posts:

https://www.snowflake.net/data-sharehouse-brings-forth-new-market/

https://www.snowflake.net/data-sharehouse/

Then download the free ebook “From Data Warehouse to Data Sharehouse” for an even more in-depth look at Snowflake Data Sharing.

And sign up for the live webinar “A Deeper Look at Data Sharing,” coming next week.

So what do you think? How could this change your business?

Cheers.

Kent

The Data Warrior

Snowflake at Stoweflake


Every year the World Wide Data Vault Consortium (WWDVC) gets better and better! This year’s event was the 4th Annual and was again held at the lovely Stoweflake Mountain Lodge in Stowe, Vermont.

WWDVC Stoweflake Balloon

And once again this year, my employer, Snowflake Computing, was a proud sponsor of the event. This year I even got to help with a hands-on workshop with our ELT partner Talend, as we walked folks through building a Data Vault in Snowflake using Talend!

WWDVC Sponsors

100 attendees got their minds filled and horizons broadened by an amazing slate of presentations given by great speakers from all over the world. Not only did we hear some real-life case studies from companies like Micron and Intact Financial (who have VERY large data vaults), but we even got to hear from someone at the US DoD (yes, the Department of Defense!).

Then there were these mind-bending talks that challenged the most experienced in the audience:

– Measuring Data as an Asset (by Nols Ebersohn)
– How to get a DV project Approved (by Neil Strange)
– Uncertainty, Risk, & The Value of Information (by Brad Bergh)

And there were, of course, absolutely awesome keynotes from Tamara Dull (A Big Data Cheat Sheet for the Technically Savvy Data Professional) and from Scott Ambler, who asked us (very clearly!): Are You Agile or Are You Fragile?

What? You missed WWDVC 2017?

Well I guess you will have to wait for the 5th Annual WWDVC in 2018…

WWDVC Sunset

Can’t wait? You are in luck!

This year Dan hired professional videographers to record the entire event.

Yup, all the workshops, the keynotes, and all the presentations.

I have seen the videos and they came out great.

So, if you would like to join the elite group of 100 data vault aficionados that attended WWDVC17, you now have the chance to see and hear the same great content we all were exposed to. Then you can be the champion for bringing Data Vault 2.0 to your organization.

You can purchase access to all the recordings right here.

NB: There are no refunds on this purchase out of consideration for those who spent the time and money to attend the event.

Here’s what you get:

Pre-Conference Sessions

  • Brainstorming with Dan Linstedt (Inventor of the Data Vault and DV 2.0), Michael Olschimke (Co-Author, Building a Scalable Data Warehouse with DV 2.0) and Sanjay Pande (co-founder, LearnDataVault)
  • Talend and Snowflake Hands-On Session
  • WhereScape Hands-On Session
  • Analytix DS Hands-On Workshop

Conference Sessions

Day 1

  • Keynote I – A Big Data Cheat Sheet for the Technically Savvy Data Professional (Tamara Dull, Director of Emerging Technologies, SAS)
  • Implementing a Data Vault 2.0 in the DoD (Cynthia Meyersohn, Senior Technical Consultant, Quadrint)
  • Data Vault 2.0 and the Power of Metadata (Steven Mellare, Data and Information Architect and Strategist, Pepper Money)
  • Software Defined Data Warehouse Using Data Vault 2.0 (Tevje Olin, Data Architect and Consultant, Solita)
  • Big Data Vault at Micron (Mike Magalsky, Enterprise Data Architect, Micron and Chris Sundstrom, Principal Data Architect, IM Flash Technologies)
  • Talend in the world of Data Vault (Dale Anderson, Customer Success Architect, Talend)
  • Analytix DS (Sam Benedict, VP Strategic Accounts, Analytix DS)
  • Data Mining in the Data Vault (Michael Olschimke, CEO, ScaleFree)

Day 2

  • Keynote II – Are You Agile or Are You Fragile? (Scott Ambler, Senior Consulting Partner, Scott Ambler + Associates)
  • Agile Methods and Data Warehousing: How to Deliver Faster (Kent Graziano, Senior Technical Evangelist, Snowflake Computing)
  • No DV is an Island: What Lies Beyond (Nols Ebersohn, Principal Architect, Certus Solutions Limited)
  • Beyond a Hadoop DV 2.0 Data Warehouse (Sanjay Pande, Co-Founder, LearnDataVault.com)
  • Business Vault Creation using a Rules Engine (Bruce McCartney, Senior Information Architect, First4 Database Partners)
  • Getting a Data Vault Project Approved (Neil Strange, Founder and MD, Business Thinking)
  • A Data Modeler and Process Modeler Walk into a Data Vault (John Giles, Independent Consultant and author of “The Nimble Elephant”)
  • WhereScape Automation Enabling Data Vault 2.0 (Neil Barton, CTO, WhereScape and Paul Watson Gover, Senior Solution Architect, WhereScape)

Day 3

  • Data Vault Automation – An OnGoing Story at Intact Financial (Francois Trudeau, Application Architect for Enterprise Information Systems, Intact Financial)
  • Moving to the Cloud, Metadata Driven Automation at Yale (Robert Scott, CTO, EON Collective)
  • Uncertainty, Risk and the Value of Information (Brad Bergh, Enterprise Information Consultant)

The only things you miss out on are the great food and, of course, the in-person networking. So put WWDVC18 on your calendar (May 2018), but in the meantime get started by purchasing the videos from WWDVC17 now.


Hopefully seeing these talks may even inspire you to not only attend next year but maybe even speak yourself!

Enjoy!

Kent

The Data Warrior & Data Vault Master

P.S. Of course, if you have any questions or want to learn more about Snowflake, the first cloud-native data warehouse as a service, please reach out to me or follow me on Twitter @kentgraziano.
