This week I’m attending the SQL PASS Summit in Seattle. I’ll be live blogging each of the keynote presentations on Wednesday and Thursday morning. This post will be updated throughout the Day 1 keynote.
We’re off and running! PASS president Adam Jorgensen starts us off with some opening remarks about PASS and Microsoft. Adam shares that PASS has been polling members to find out more about who we are, and has identified 10 different role-based personas. PASS is now 250,000 members strong.
Adam announces that Malathi Mahadevan has won the PASSion Award this year. I can’t think of anyone more deserving. Congratulations, Mala!
Get out and connect! Adam recommends making 5 new contacts this week.
Joseph Sirosh takes the stage for Microsoft. He describes “A.C.I.D. Intelligence”: Algorithms, Cloud, IoT, Data. Love the intersection, but I don’t like (yet another) overloaded acronym.
Rohan Kumar comes aboard to speak a bit about Intelligence DB. He reports that with in-memory intelligent processing, they have seen 100x faster analytics and 30x faster transactions. He announces that fully managed HTAP (hybrid transactional/analytical processing) in Azure SQL DB will be generally available on November 15th. In a very high-level demo, he shows a ramp-up test in which SQL Server processes over 1,000,000 analytical transactions per second, with fraud detection algorithms running inline. He illustrates a real-world scenario in which Jack Henry and Associates (a client of mine) was able to process the analysis of 20 million loans in just 15 seconds.
Next up, Justin Silver, a data scientist from PROS, steps in. His company provides pricing data to airlines, servicing over 50% of the price points in the airline industry. They use Azure and R to do this processing, which requires millisecond response times.
Rohan returns to the stage to show a brief example of using PolyBase to query multiple types of data sources at once and integrate the results. It’s a much better solution than linked servers.
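For readers who haven’t seen PolyBase, the demo followed the usual pattern: define an external data source and file format, expose the remote data as an external table, then query it with plain T-SQL. This is a minimal sketch under my own assumptions, not the demo code; every object name, location, and column here is hypothetical:

```sql
-- Hypothetical PolyBase setup; names, addresses, and paths are illustrative.
-- 1) Point SQL Server at an external (here, Hadoop) data source.
CREATE EXTERNAL DATA SOURCE HadoopCluster
WITH (
    TYPE = HADOOP,
    LOCATION = 'hdfs://10.0.0.1:8020'
);

-- 2) Describe how the external files are formatted.
CREATE EXTERNAL FILE FORMAT CsvFormat
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',')
);

-- 3) Expose the external data as a table.
CREATE EXTERNAL TABLE dbo.WebClicks (
    UserId    INT,
    Url       NVARCHAR(400),
    ClickTime DATETIME2
)
WITH (
    LOCATION = '/logs/clicks/',
    DATA_SOURCE = HadoopCluster,
    FILE_FORMAT = CsvFormat
);

-- 4) Join external data to a local table in a single query.
SELECT c.CustomerName, COUNT(*) AS Clicks
FROM dbo.Customers AS c
JOIN dbo.WebClicks AS w ON w.UserId = c.CustomerId
GROUP BY c.CustomerName;
```

Once the external table exists, it behaves like any local table in queries, which is exactly why this feels so much cleaner than linked servers.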
He continues by showing SQL Server on Linux. At the same time, the lights start flickering. Coincidence? Very cool demo, though – he connected to a SQL Server instance running on Linux and restored a SQL Server database backup from an instance running on Windows.
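What makes the restore demo work is that SQL Server’s backup format is platform-independent, so moving from Windows to Linux is just a standard restore with new file paths. A minimal sketch, with a hypothetical database name and the typical Linux default data directory:

```sql
-- Restore a backup taken on Windows onto a SQL Server on Linux instance.
-- Database and logical file names here are illustrative; only the target
-- paths differ from what you'd use on Windows.
RESTORE DATABASE SalesDB
FROM DISK = '/var/opt/mssql/backups/SalesDB.bak'
WITH MOVE 'SalesDB'     TO '/var/opt/mssql/data/SalesDB.mdf',
     MOVE 'SalesDB_log' TO '/var/opt/mssql/data/SalesDB_log.ldf';
```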
Next up we’re talking Azure DocumentDB with the CEO of Next Games. We start with a trailer of their Walking Dead-themed game (which is completely lost on me since I’m not a fan of the show).
Joseph Sirosh returns to talk about data scale and Azure SQL Data Warehouse. He illustrates the compounding of data scale by comparing a byte to a grain of rice and a kilobyte to a cup of rice, all the way up to a yottabyte, which would be an Earth-sized ball of rice.
A couple of new announcements:
- A one-month free trial is now available for Azure SQL Data Warehouse: https://azure.microsoft.com/en-us/services/sql-data-warehouse/extended-trial/
- A preview of Azure Analysis Services is now available
Joseph hands off to Julie Koesmarno, who talks a bit about sentiment analysis with U-SQL, using the novel War and Peace as a source. She shows how you can view the results of the distributed processing in a visual similar to an execution plan. It’s not entirely clear to me how this works, but it looks very cool. The demo shows how her query breaks down the sentiment of each character in the book. The practical applications of this are numerous, and I look forward to learning more about the plumbing in place that drives this analysis.
Joseph returns to discuss deep intelligence. The example he shows is image analysis, which identifies objects within a photograph.
One new feature that just caught my attention is the Database Experimentation Assistant. Joseph says that this facilitates A/B testing for database migrations. The official announcement actually came out a couple of days ago. Note to self: go check this out.
Now Jen Stirrup talks about Power BI and Pokémon. Fun visualizations with analysis of Pokémon activity!
Joseph is up again to show a video of a visually impaired man using real-time visual analytics that reports audibly through headphones what’s going on around him. The examples include his special glasses analyzing the people in front of him and describing their gender, age, and even facial expressions (output: “a 40 year old bearded man looking surprised”). The app can also intelligently read restaurant menus to him, even inferring categories such as appetizers and entrées.
That’s it for the Day 1 keynote. Look for another post here tomorrow for the Day 2 keynote.