Tim Mitchell

24 Hours of PASS – Recap

At times it felt like a party, it had enough content to be a mini-conference, and we learned that some people get a little punchy after 24 straight hours of SQL Server.  Whether you were a presenter, a casual observer, or someone who stayed engaged for the whole event, the 24 Hours of PASS event for 2009 was a memorable experience.

I was fortunate to have participated in several sessions, but a few recent family illnesses and a late-running project kept me from getting engaged with all 24 sessions as I had initially planned.

I started out by listening to Allen White’s session on PowerShell for SQL Server. Allen’s depth of knowledge and excitement about this topic shone through, and I’ve listed several things that I want to try based on his presentation.  Notable takeaway: PowerShell can be used to browse a SQL Server object hierarchy much like a filesystem.  Very cool!
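Allen's filesystem comparison refers to the SQL Server PowerShell provider, which exposes the server's object hierarchy as a navigable drive. Here's a minimal sketch of the idea; the server name, database name, and module loading are assumptions (the exact snap-in or module varies by SQL Server version), and it naturally requires a live instance to run:

```powershell
# Sketch: browsing a SQL Server instance like a filesystem via the
# SQL Server PowerShell provider. Instance and database names are
# placeholders; adjust for your environment.
Import-Module SQLPS -DisableNameChecking  # on SQL Server 2008, a snap-in is used instead

# Change into the provider's drive, just like cd'ing into a folder
Set-Location SQLSERVER:\SQL\localhost\DEFAULT\Databases

# List databases the way you'd list files
Get-ChildItem | Select-Object Name, RecoveryModel

# Drill down into a database's tables
Set-Location .\AdventureWorks\Tables
Get-ChildItem | Select-Object Schema, Name
```

The appeal is that familiar navigation commands (`cd`, `dir`/`Get-ChildItem`) work against databases, tables, and other server objects just as they do against folders and files.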

Two nontechnical presentations were particularly good.  Having moved into a team lead position last year, I found Kevin Kline’s “Team Management Fundamentals” discussion to be particularly valuable.  I also listened to Steve Jones’s presentation, “Building a Better Blog”; for anyone working to improve the quality and consistency of their blog, a lesson from a hard-core blogger such as Steve is not to be missed.  Notable takeaway: Short blog posts are fine, but blogging consistently (weekly or even monthly) is critical.

I was able to catch one of the two SSIS presentations.  John Welch presented “Delivering Good Performance with SSIS”, which had several good tips for those with beginning to intermediate skills in SSIS.  Most people working with SSIS focus on the mechanics and accuracy of ETL operations, and don’t spend a lot of time optimizing data flow.  Notable takeaway: Be aware of nonblocking, partial blocking, and fully blocking transformations in your packages, as they can significantly impact performance.

Although I missed a good part of it, Adam Machanic’s session entitled “SQLCLR or T-SQL? A Brief Survey of Performance Options” was helpful.  He demonstrated some scenarios where T-SQL outperforms SQLCLR, and vice versa.  I also missed most of the session “Embed Reporting Services Into Your Applications” delivered by Jessica Moss, but I saw enough to learn that SSRS can be used with Windows applications as well as web apps.  I got a refresher course on recovery models in the session “What’s Simple about Simple Recovery Model” from Kalen Delaney, who is a wealth of information on SQL Server internals.

There were a few sessions I missed entirely that I wish I’d been able to dig into.  There was a text mining session in the middle of the night (CDT) that looked very interesting.  I know very little about SQLdiag, so I wish I’d been able to attend Brad McGehee’s session on that topic.  MVPs and fellow SSC’ers Gail Shaw and Grant Fritchey discussed indexing and performance tuning, respectively, and I’m sure they both hit home runs.

There were a couple of things that could have gone better.  First, several of the session links listed on the PASS website were incorrect.  This issue may have been minimized by the prolific use of Twitter, since people were able to ask for a valid link when the original one didn’t work.  Brent Ozar also posted a correct set of links during the presentation, but many were still using the PASS site (as one would expect).  I’m sure everyone had their hands full, but having a person on standby for those logistical issues is important when the link is the very gateway to the event.

Also, Tom LaRock had a feed of his own for the duration of the 24 hours; I started off watching both the session and Tom, but the combination of his commentary and the echo on his audio became too distracting, and I turned off his feed after just a couple of sessions.  I like the idea behind this – it gives the event a more personal, interactive feel – but I’d suggest reserving more of the commentary for the Q&A time or the gap between sessions.  And Tom, I’ll spring for a comfy set of headphones for next year to avoid the cursed echo… 🙂

All of the sessions were recorded, though that fact wasn’t widely advertised before the event – a wise move, in my opinion, since many would have skipped the live event if they’d known that they could watch the rerun later.  Some of the recorded sessions will be available for viewing as early as next week, with all of them ready by late November or December.  If you missed any of these sessions (or if – gasp! – you skipped the whole thing), keep an eye out for these recordings on the PASS website.  I’ll definitely go through and catch the ones I missed, and will likely replay the ones I attended when I can fully focus and make notes for myself.

I have never heard of another group delivering content in this unique way.  Pushing out sessions for 24 hours straight shows the kind of fresh, community-focused ideas coming out of PASS these days, and will hopefully give SQL Server professionals who are not currently involved with PASS or a local chapter group an incentive to get engaged.  This organization is focused on growth, and it’s a very exciting time to be involved.  If you’re not a member, join PASS today!

[Cross-posted from SQL Server Central]

About the Author

Tim Mitchell
Tim Mitchell is a business intelligence and SSIS consultant who specializes in getting rid of data pain points. Need help with data warehousing, ETL, reporting, or SSIS training? Contact Tim here: TimMitchell.net/contact
