The Data Warrior

Changing the world, one data model at a time. How can I help you?

TPC-DS at 100TB and 10TB Scale Now Available in Snowflake’s Samples

Here is another great announcement we made at Snowflake that I missed sending out to everyone. It is really cool to have a full data set to test on your own. If you want to give it a try, sign up for a FREE Snowflake account.

Get your TPC-DS data here

We are happy to announce that a full 100 TB version of TPC-DS data, along with samples of all 99 of the benchmark's queries, is available now to all Snowflake customers for exploration and testing. We also provide a 10 TB version if you are interested in smaller-scale testing.

The STORE_SALES sub-schema from the TPC-DS Benchmark (Source: TPC Benchmark™ DS Specification)
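As a quick sketch of what querying the sample data might look like: Snowflake publishes the sets in a shared database (the database and schema names below, SNOWFLAKE_SAMPLE_DATA and TPCDS_SF10TCL/TPCDS_SF100TCL, follow Snowflake's documented naming; the helper function and the aggregation itself are purely illustrative, not taken from the post).

```python
# Illustrative sketch: compose a simple TPC-DS-style revenue query against
# Snowflake's shared sample database. Column names (ss_net_paid, ss_store_sk,
# s_store_sk, s_store_id) come from the TPC-DS STORE_SALES and STORE tables.

SAMPLE_DB = "SNOWFLAKE_SAMPLE_DATA"

def store_sales_query(scale_schema: str = "TPCDS_SF10TCL", limit: int = 10) -> str:
    """Return SQL totaling STORE_SALES net revenue by store at the given scale.

    Use scale_schema="TPCDS_SF100TCL" for the 100 TB version.
    """
    return (
        f"SELECT s_store_id, SUM(ss_net_paid) AS total_net_paid "
        f"FROM {SAMPLE_DB}.{scale_schema}.STORE_SALES "
        f"JOIN {SAMPLE_DB}.{scale_schema}.STORE ON ss_store_sk = s_store_sk "
        f"GROUP BY s_store_id ORDER BY total_net_paid DESC LIMIT {limit}"
    )

# The resulting string would be run through your Snowflake client of choice.
print(store_sales_query())
```

Switching between the 10 TB and 100 TB sets is then just a matter of pointing the query at the other schema.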

For all the details, continue reading here: TPC-DS at 100TB and 10TB Scale Now Available in Snowflake’s Samples

Enjoy!

Kent


New Snowflake features released in Q2’17 

I have been busy lately preparing and delivering quite a few talks, so I got a bit behind on my blogging and reporting. In an effort to catch up, here are some details on developments at Snowflake:

Q2 2017 Features

It has been an incredible few months at Snowflake. Along with the introduction of self-service and numerous other features added in the last quarter, we have witnessed:

  • Exponential growth in our customer base, with large numbers of applications in full production.
  • Billions of analytical jobs successfully executed this year alone, with petabytes of data stored in Snowflake today, and without a single failed deployment to date.
  • Strong interest in pushing the boundaries of data warehousing even further by allowing everyone in an organization to share, access, and analyze data.

Continuing to engage closely with our customers during this rapid growth period, we rolled out key new product capabilities throughout the second quarter.

Get the rest of the details here: New Snowflake features released in Q2’17

Cheers

Kent

OOW is just around the corner

Yes, it is indeed time for Oracle Open World. Unfortunately I will not be there speaking myself this year, but do not despair, my friends: there is still plenty of goodness from the community, such as these talks from my good friend Heli. She is coming all the way from Helsinki, Finland to share her knowledge. Go to one of her talks and tell her The Data Warrior sent you!

HeliFromFinland

I cannot believe how fast time flies! It has been an exceptionally busy year, and that is the reason I have not had time to write blog posts either. I have been speaking at amazing events like BIWA Summit, APEX Connect, Riga DevDays 2017, OTNEMEA Tour in Baku and in Madrid, BGOUG, E4, KScope17, Oracle Code Bengaluru, the first ever APEX Day in Finland, Oracle Code Seoul, POUG, and Tajikistan TechDay. Next on my list is Oracle OpenWorld. As always it will be a busy week.

I will fly to San Francisco on Thursday to attend the ACE Director briefing on Friday at the Oracle HQ. I will have three talks:

[Image: OOW session catalog]

And I will also have two short talks:

  • one at the EOUC track
  • and another one at the WIT Lightning Storm session

I will also have several meetings, plus swimming, running, and a chocolate tasting:

https://www.eventbrite.com/e/chocolate-tasting-at-oracle-openworld-2017-tickets-38069813838

Just to mention some of the…


Get Certified! #DataVault 2.0 Certification in the US

Quick update – if you have been waiting to get your Data Vault 2.0 certification, there are three sessions coming in the next few months right in the USA. If you already know you want to do that, just skip down to the links and sign up!

Why Data Vault?

The Data Vault 2.0 architecture gives you an entire systems-based approach to developing a true enterprise data warehouse and analytics architecture. It is very structured, pattern-based, and highly repeatable. In Data Vault, each component does its duty, and does it well. The engineering components are generally relegated to automation tools (because the approach is pattern-based), so human effort is not wasted on the mundane and can be spent on more interesting, intelligent tasks. It is a much better use of intelligent beings as well as machines.

Separating the concerns makes design and development not just easy, but fast.

As a side effect, projects using Data Vault 2.0 have consistently saved a lot of money and met their goals predictably. Plus, the resulting systems are very resilient, so they tend to stay in use for years to come with little or no re-engineering! One of my systems has been running for 14 years now – and was even successfully re-platformed in that time.

How do you get in on this innovative approach?

If you want to learn more (and why wouldn't you?), there are many upcoming opportunities across the world to get more information about Data Vault 2.0 (just check Twitter or LinkedIn – look for #DataVault). If you understand it and want to use it to drive your own success, you can even get certified (that comes with responsibility, though).

Here’s a list of upcoming opportunities to get DV 2.0 certified in the US:

1. Sep 19-21, Chicago, IL – http://www.performanceg2.com/agile-bi-datavault-training/

2. Oct 2-4, New York City, NY – http://www.scalefree.com/2017/03/30/data-vault-2-0-boot-camp-and-certification-new-york-oct-2017/

3. Nov 27-29, Santa Clara, CA – http://www.scalefree.com/2017/03/29/data-vault-2-0-boot-camp-and-certification-santaclara-nov-2017/

Ready to challenge the status quo and become a data champion at your organization? Then sign up for one of these classes today!

Model on!

Kent

The Data Warrior

#SQLDevModeler Tip: From Domain to Database… A Comment Conundrum

Great tip on creating a custom transformation script in SQL Developer Data Modeler (SDDM) from the awesome David Schleis:

Recently on the Data Modeler Forum, I came across this question:

Is it possible to mirror domain comments, from Domain Administration into attribute “Comments in RDBMS”?  Would like to mirror these to the ddl so they can be then available in column comments in the database.

I knew that one of the example transformation scripts provided with Data Modeler copies the column “Comments” property to “Comments in RDBMS”, so I thought I would point this out to the questioner, and that would be that.  But….

See the rest of the story… From Domain to Database… A Comment Conundrum

Model on!

Kent

The Data Warrior
