The Data Warrior

Changing the world, one data model at a time. How can I help you?

Archive for the tag “SQL”

ANSI SQL with Analytic Functions on Snowflake DB

Here is another installment of my Top 10 blog list of cool features of Snowflake Elastic Data Warehouse:

At Snowflake, we believe that it should be easy to access, query, and derive insights from your data. To support that, we provide our users with the ability to query all their data using ANSI-compliant SQL. (Hard to call yourself a relational database otherwise, right?)

However, Snowflake goes beyond basic SQL, delivering sophisticated analytic and windowing functions as part of our data warehouse service.

See how the Snowflake Elastic Data Warehouse supports ANSI SQL as well as sophisticated Analytic functions to allow you to derive value from all your data.
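To give a flavor of the kind of ANSI analytic (windowing) functions the post covers, here is a small sketch. The table, columns, and data are invented for illustration, and I run the query against an in-memory SQLite database purely so the standard syntax is easy to try; the same SQL shape works on Snowflake.

```python
import sqlite3

# Illustrative only: ANSI SQL window functions of the kind Snowflake supports.
# Table and data are made up; SQLite 3.25+ (bundled with modern Python) is
# used here just to demonstrate the standard syntax.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount INTEGER);
    INSERT INTO sales VALUES
        ('East', 100), ('East', 300), ('West', 200), ('West', 50);
""")

rows = conn.execute("""
    SELECT region,
           amount,
           SUM(amount) OVER (PARTITION BY region)                       AS region_total,
           RANK()      OVER (PARTITION BY region ORDER BY amount DESC)  AS rnk
    FROM sales
    ORDER BY region, rnk
""").fetchall()

for r in rows:
    print(r)
# ('East', 300, 400, 1) ... each row keeps its detail amount while also
# carrying the regional total and a rank within its region.
```

The point of the OVER clause is that you get aggregates and rankings alongside the detail rows, without collapsing them the way GROUP BY would.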

See the rest of the post here:

ANSI SQL with Analytic Functions – Snowflake

Yes SQL!


The Data Warrior

P.S. Want to learn more about Snowflake? Check out our extensive library of white papers, case studies, and recorded webinars on our resources page.

Would You Like to Load a Data Vault like a Master?

If you are about to embark on a Data Vault project this year, and are not quite sure of the best way to load your data into a Data Vault efficiently, I have a great opportunity for you.

Dan and Sanjay have been hard at work re-vamping the Learn Data Vault site and are now ready to relaunch with a brand new look and feel.

The first course out of the gate is the SQL Implementation course. This online course walks you through the details of implementing a Data Vault-based data warehouse using plain and simple SQL. These are the same techniques I have been successfully using for over 10 years on my Data Vault (and even Non-DV) projects. I learned them from Dan in my very first Data Vault class and have used them ever since.

They just work!

If you are building on a standard relational database platform (on premises or in the cloud), this class gives you all the patterns and code examples you need to move data into not only Hubs, Links, and Sats but your staging areas as well. It also includes the techniques for implementing very efficient change data capture (i.e., delta detection).
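I can't reproduce the course material here, but the general shape of one of those patterns, the Hub load, is widely known: insert only the business keys from staging that the Hub has not seen yet, which also makes the load safely re-runnable. A minimal sketch, with hypothetical table and column names and a plain integer surrogate key standing in for whatever key strategy your implementation uses, run on SQLite just for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_customer (customer_bk TEXT, load_date TEXT, record_source TEXT);
    -- customer_hk is a simple auto-assigned surrogate here; real implementations
    -- may use sequences or hash keys instead.
    CREATE TABLE hub_customer (customer_hk INTEGER PRIMARY KEY,
                               customer_bk TEXT UNIQUE,
                               load_date TEXT,
                               record_source TEXT);
    INSERT INTO stg_customer VALUES
        ('CUST-001', '2024-01-01', 'CRM'),
        ('CUST-002', '2024-01-01', 'CRM'),
        ('CUST-001', '2024-01-02', 'CRM');  -- repeated key: must load only once
""")

# Hub load pattern: only business keys not already in the Hub, one row per key.
conn.execute("""
    INSERT INTO hub_customer (customer_bk, load_date, record_source)
    SELECT s.customer_bk, MIN(s.load_date), MIN(s.record_source)
    FROM stg_customer s
    WHERE NOT EXISTS (SELECT 1
                      FROM hub_customer h
                      WHERE h.customer_bk = s.customer_bk)
    GROUP BY s.customer_bk
""")

hub_keys = [r[0] for r in conn.execute(
    "SELECT customer_bk FROM hub_customer ORDER BY customer_bk")]
print(hub_keys)  # ['CUST-001', 'CUST-002']
```

Because of the NOT EXISTS check, running the same load again inserts nothing, which is exactly the idempotent behavior you want from a Data Vault loading routine.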

You can see the full outline for all the topics and modules here.

The Deal

So the deal is that the original retail price was $997. But right now, with the re-launched site, this class is marked down to $797 for a short time (a savings of $200). This course has sold in the past for $1,497, and they could raise the price back up at any time.

The Better Deal: A Fire Sale

Starting today until February 13th, Learn Data Vault is having a Fire Sale! You can take another $400 off with this special Data Warrior Coupon Code: DWFS400OFF.

So with the marked-down price plus my coupon, you can get the Data Vault SQL Implementation class for only $397 (a savings of $600).

But remember, it is only good until February 13 (1 week from today!).

UPDATE: If you are reading this after the sale is over, you can still get a discount by using my special blog reader discount code: Kent10S, which will get you 20% off.

So don’t delay – sign up for the class today. It is the quickest way I know for you to get productive with your Data Vault implementation. (And it does not hurt that you can use these techniques for lots of non-Data Vault systems too. I have.)

Why a Fire Sale?

So if this class is so good, why such a great sale?

I asked Sanjay the same thing. This is what he told me:

  1. This is really a re-launch on a new platform, and they want to give it a workout.
  2. There may be some small errors on the site that they missed during QA. (The price will go up once they are sure it is completely error-free.)
  3. They need feedback on the new site to be sure it is the best it can be, and the best way to get that feedback quickly is to get people using the site!
  4. It’s Valentine’s Day!

So please sign up ASAP before the coupon code expires.

SQL Rules!

Happy Coding!

Kent, The Data Warrior

Data Vault Master (CDVDM, CDVP2)

Snowflake SQL: Making Schema-on-Read a Reality (Part 2)

This is the 2nd of my articles on the Snowflake blog.

In the first article of this series, I discussed the Snowflake data type VARIANT, showed a simple example of how to load a VARIANT column in a table with a JSON document, and then how easy it is to query data directly from that data type. In this post I will show you how to access an array of data within the JSON document and how we handle nested arrays. Then finally I will give you an example of doing an aggregation using data in the JSON structure and how simple it is to filter your query results by referring to values within an array.
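The full post shows this with Snowflake's SQL extensions over the VARIANT column (for example, dotted paths like v:children[0].name and LATERAL FLATTEN to unnest an array). As a language-neutral sketch of what those operations mean, here is the same idea in plain Python, using an invented JSON document shape:

```python
import json

# Hypothetical JSON document with a nested array, of the kind the post queries.
doc = json.loads("""
{
  "fullName": "Jane Doe",
  "children": [
     {"name": "Sam", "age": 10},
     {"name": "Kim", "age": 8}
  ]
}
""")

# Access one element of the array
# (roughly what v:children[0].name::string does in Snowflake SQL).
first_child = doc["children"][0]["name"]

# Aggregate over the array
# (roughly an AVG over the rows produced by LATERAL FLATTEN(v:children)).
avg_age = sum(c["age"] for c in doc["children"]) / len(doc["children"])

# Filter results by a value inside the array.
older_than_9 = [c["name"] for c in doc["children"] if c["age"] > 9]

print(first_child, avg_age, older_than_9)
```

The appeal of doing this in SQL rather than application code is that the array access, aggregation, and filtering all happen inside the same query engine as the rest of your relational data.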

Check out the rest of the post here:

Snowflake SQL: Making Schema-on-Read a Reality (Part 2) – Snowflake



The Data Warrior

Snowflake SQL: Making Schema-on-Read a Reality (Part 1) 

This is my 1st official post on the Snowflake blog in my new role as their Technical Evangelist. It discusses getting results from semi-structured JSON data using our extensions to ANSI SQL.

Schema? I don’t need no stinking schema!

Over the last several years, I have heard this phrase schema-on-read used to explain the benefit of loading semi-structured data into a Big Data platform like Hadoop. The idea being you could delay data modeling and schema design until long after the data was loaded (so as to not slow down getting your data while waiting for those darn data modelers).

Every time I heard it, I thought (and sometimes said) – “but that implies there is a knowable schema.”  So really you are just delaying the inevitable need to understand the structure in order to derive some business value from that data. Pay me now or pay me later.

Why delay the pain?

Check out the rest of the post here:

Snowflake SQL: Making Schema-on-Read a Reality (Part 1) – Snowflake



The Data Warrior

Welcome to the Biggest KScope Conference Ever!

Day breaks over the Mighty Mississippi

Day one at ODTUG’s KScope13 was awesome as always with a great set of symposiums and networking events.

At the speaker meeting we learned that there are 1,400 registered attendees, 30 countries represented and over 50 exhibitors making this the largest event and largest exhibit hall in the history of the Oracle Development Tools User Group (ODTUG).

So we must be doing something right to attract such a large crowd.

Part of what we do right is getting the top Oracle developers, consultants, and Oracle ACEs in the world to present. And we get stellar participation from the Product Management and Development teams at Oracle Corporation.

Even though it was Sunday, there was a tremendous turnout for our free Sunday Symposiums. In the Symposiums we had rooms dedicated to specific topics so attendees could stay in one place all day and get a series of related talks.

After conducting my now annual Morning Chi Gung class (had nine attendees!), I spent the day in the DB & Developer’s Toolbox Symposium. It had quite a lineup.

First I attended Jeff Smith’s session on SQL Developer 4.0, which will come out later this summer.

SQL Developer Product Manager, Jeff Smith of Oracle, introduces new features in the upcoming 4.0 release

Got lots of great tips and tricks, which I am sure Jeff will be blogging about in the near future.

Next was Oracle Technologist Tom Kyte, who spoke about many new features and enhancements to SQL and PL/SQL in the upcoming Oracle 12c release of the database.

Tom Kyte (of AskTom fame) introduces attendees to new SQL and PL/SQL features in Oracle 12c

And last for my day of learning was Maria Colgan discussing changes to the Oracle Optimizer in 12c.

Oracle Product Manager, Maria Colgan, discusses the Evolution of the Oracle Optimizer: Rules based to 12c

As always, Maria provided so much information my head was ready to explode from information overload. And later in the week she has a second session on this topic with even more new features to discuss. If every Oracle DBA could attend one of her sessions, all of our databases would run better!

After the sessions it was time for formal and informal networking. There was the speakers meeting and the official welcome reception in the exhibit hall. I got to attend an official Oracle ACE dinner cruise on a steam powered paddle wheel river boat complete with live jazz band. Very much in the spirit of New Orleans.

Stay tuned for more reports as the week progresses.


The Oracle Data Warrior

P.S. I got to do a little touring as well and visited the oldest Catholic Cathedral in the United States. It is quite a beautiful church inside and out.

The beautiful St Louis Cathedral near Jackson Square in New Orleans
