SQL Saturday #90 OKC

This past Saturday, I made the relatively short (3 hour) drive north to Oklahoma City to participate in their first ever SQL Saturday event.  I have to say that this was one of the best SQL Saturday events I’ve attended, which is especially notable since it was their first such event.  The venue they chose was the Moore Norman Technology Center, which was well suited to the size and layout of this event.  The big room used for the opening keynote by Steve Jones was subdivided (using remote-controlled motorized room dividers – very cool!) into the three rooms required for the three tracks.  Although one of the rooms had some audio and video issues, the facilities were more than adequate, and building staff were on hand to help out with the few issues that came up.  Lots of parking, wide hallways, and plenty of restrooms made the facility easy to navigate.

This SQL Saturday had just three tracks, the smallest number of any such event I’ve attended.  It worked out well, though – the tracks were broken out by discipline (DBA, dev, BI) so each timeslot offered a variety of topics. 

I got to present two sessions in the BI track: I did my T-SQL vs. SSIS session that I originally wrote for SQL Rally, and I also offered an introductory BI session that briefly covered SSIS, SSAS, and SSRS.  Each of my sessions was well attended – 48 in the first, and 66 in the second – with lots of audience interaction.  By the way, my slide decks and code samples are available on the Presentation Notes page of my website.

The OKC group hosted a speaker dinner the night before, as well as an attendee reception after the event on Saturday.  Unfortunately, the notification about the attendee reception went out late in the week before the event, and as a result there wasn’t a big turnout on Saturday.  However, a good number of speakers showed up on Friday night.  The OKC group was kind enough to buy a generous round of appetizers both nights.

This was a well organized event, so I don’t have much to critique.  Here are a few observations I made:

  • Signage was lacking.  Although the building was easy to find using GPS, I didn’t see any signs at all indicating where the building was, which entrance to use, etc.  Fortunately, it was not a huge building, and even though I entered through the wrong door I was able to quickly find my way.
  • Volunteer shirts would have helped.  I was asked a lot of questions about logistics and such (where are the restrooms? What time is session <X>?) simply because I had a speaker shirt.  I like having the event staff wear different colored shirts so they can quickly be identified as someone in the know.
  • Communication was late.  As I mentioned, the word about the reception went out late in the week before the event.  Even though it was branded as an “unofficial” event, I think there would have been a better turnout had the message gone out sooner.

Apart from those issues, I thought the event went very well.  Hats off to Matthew Brimer, Kristin Ferrier, and the rest of the Oklahoma City SQL crew for hosting an excellent event.

As an aside, we had a significant representation from the Dallas area at this event.  I didn’t get an exact count, but there were at least 10 speakers from the Dallas area, and at least another 10 attendees.

Thanks again to the OKC crew for having us! Congrats on a great event, and we look forward to it again next year.

T-SQL Tuesday (er, Wednesday): Crap Code

Ok, I have two admissions that I must bare to the world in this post.  The first is that I’ve been a little lazy.  I’ve admired – from a distance – this T-SQL Tuesday thing ever since it started.  It’s been great!  A bunch of people blogging on the same day about their own interpretation of some common topic – brilliant!  Sadly, I’ve been too busy playing Angry Birds (er, keeping up with my other responsibilities) and haven’t made the time to add my own submissions to this phenomenon.  However, I’ve finally killed the last of those damned pigs (that is, caught up on my other work), so I get to let my hair down for a bit and join the rest of you fine folks in this thing.

The second admission is that I’ve rolled out some, uh, less than optimal solutions in the past.  Any developer who is truthful will admit to a few instances of rolling out solutions with poorly performing cursors, improperly indexed tables, and convoluted code, all accompanied by little if any documentation – quite simply, it happens from time to time.  But in keeping with the T-SQL Tuesday topic for this month, participants are asked to share about a time when they rolled out some truly…

Crap Code

Without a doubt, I could find a number of instances where I’ve written code that wasn’t worth the electricity it took to store it, but one such occasion really sticks out.  This was early in my IT career, close to 10 years and several jobs ago, and I was writing my first real production database application.  I fancied myself a bit of an expert – after all, I was completing my first relational database theory class in college, and I had a solid 18 months’ experience in SQL Server administration.

I had talked my boss into letting me architect (!?!) a solution that would allow educators to create short web-based assessments, to replace the manual process (as in literal cut and paste – with scissors and glue) they were currently using.  With the requisite approval in hand, I began my discovery.  Yes, me – not only was I the architect, I was also the business analyst, developer, tester, and system admin. #winning

The Development

In meeting with the educators, I learned that the assessments would be very brief – at most, 10 questions each.  Educators would be able to share assessments with others, but could only edit or delete their own.  The final product would allow the educators to save to PDF and print the forms – there was no interest in actually delivering the exams online, which made things a bit easier for me. 

So as I set out into the design phase, I envisioned the entities I would need.  Since the number of questions would be very limited, I decided to use a single table to store the questions directly with the assessments.  (Don’t get ahead of me here – it gets better.)  I did manage to store the answers to the questions in a separate table – not because it was a best practice, but simply because I couldn’t figure out an easy way to squeeze them into dbo.InsanelyWideAssessmentAndQuestionTable.  Using my freshly minted ASP.NET skills – I had recently read two entire books about C# on ASP.NET – I started coding the front end.  The result had a simple yet ugly interface, rudimentary file upload capabilities, and slow response times, but during my solo testing sessions, it did manage to do what I intended.
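For the curious, here’s a rough T-SQL sketch of that design next to the normalized version I should have built.  These are hypothetical reconstructions – aside from dbo.InsanelyWideAssessmentAndQuestionTable, which is uncomfortably close to the truth, the table and column names are just for illustration.

```sql
-- The anti-pattern: one row per assessment, with the questions pivoted
-- into repeating column groups (Question1, Question2, ... Question10).
CREATE TABLE dbo.InsanelyWideAssessmentAndQuestionTable (
    AssessmentID  INT IDENTITY(1,1) PRIMARY KEY,
    Title         VARCHAR(200) NOT NULL,
    CreatedBy     INT NOT NULL,        -- the educator who owns the assessment
    Question1     VARCHAR(1000) NULL,
    Question2     VARCHAR(1000) NULL,
    -- ... Question3 through Question9 omitted for brevity ...
    Question10    VARCHAR(1000) NULL
);

-- The normalized version: one row per question, related back to its
-- assessment, with the answers in their own table as well.
CREATE TABLE dbo.Assessment (
    AssessmentID  INT IDENTITY(1,1) PRIMARY KEY,
    Title         VARCHAR(200) NOT NULL,
    CreatedBy     INT NOT NULL
);

CREATE TABLE dbo.Question (
    QuestionID    INT IDENTITY(1,1) PRIMARY KEY,
    AssessmentID  INT NOT NULL REFERENCES dbo.Assessment (AssessmentID),
    QuestionText  VARCHAR(1000) NOT NULL
);

CREATE TABLE dbo.Answer (
    AnswerID      INT IDENTITY(1,1) PRIMARY KEY,
    QuestionID    INT NOT NULL REFERENCES dbo.Question (QuestionID),
    AnswerText    VARCHAR(1000) NOT NULL
);
```

With ten nullable question columns, every query has to know about all ten, and question number eleven means an ALTER TABLE – exactly the kind of pain the normalized version avoids.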

The Deployment

I’m certain I violated every test and deployment best practice ever written.  Source control consisted of a bunch of .zip files created at irregular intervals, and parallel testing involved two laptops on my desk, connecting to the web/database server under my desk.  Deployment was the easiest part – I just manually copied the web application files to the web server and restored the database from my desktop dev machine to the server as soon as everything seemed to function without errors.  What could go wrong?

The Meltdown

Did I mention that I was still in college at the time?  Two mornings a week, I drove to a university campus about an hour away.  I had deployed the whole thing about 9pm the night before and e-mailed the instructions for system use to the key personnel.  It was no surprise that, in the middle of class the next morning, my cell started ringing.  “We’ve got about 100 assessments in the system,” the voice said, “but we just discovered that the answers are associated with the wrong assessments!”  Further, as the educators entered data into the system, their entries were often attributed to someone else, so they couldn’t go back and edit or delete them.

After clearing 60 miles of 2-lane road back to my office in record time, I started some very quick triage while trying to avoid a few dozen dirty looks from my users.  The problem – at least the main one – was that I was using incorrectly scoped variables to track user IDs in the ASP.NET application, which caused each assessment to be associated with the last user who had saved any item; with more than 20 people entering data, there were more wrong than right.  Further, since tests/questions and answers were entered in two different steps, most of the answers were also incorrectly linked.
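To picture the symptom, here’s the kind of quick triage query I might have run that morning (reusing the hypothetical table and column names from the schema sketch above) – when per-user state is broken this way, ownership piles up under whoever saved last:

```sql
-- Hypothetical triage query: with broken per-user state, nearly every
-- row ends up stamped with the last active user's ID.
SELECT CreatedBy, COUNT(*) AS AssessmentCount
FROM dbo.InsanelyWideAssessmentAndQuestionTable
GROUP BY CreatedBy
ORDER BY AssessmentCount DESC;
```

The root cause, of course, lived in the application tier: state that should have been scoped to a single user’s session was effectively shared across every active user, so the damage grew with each concurrent educator.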

In true rookie fashion, I tried feverishly to fix the error on the fly.  Now stationed in the midst of the users trying to enter data, I would stand up every 15 minutes or so and announce, “I’m recompiling, please log off!”  This went on for maybe 90 minutes, at which point I – in probably the only wise decision I made that day – stopped the bloodshed and asked that we reconvene after we were able to correct the development issues.

The Aftermath

I was ready to fall on my sword over the debacle.  After all, it was me and my ego that had caused the whole mess.  Fortunately, I worked in a flexible environment that allowed us to introduce reasonable risk, even if it meant the occasional failure.  Along the same lines, I was given the time to make it right: after two rounds of much more rigorous testing, I successfully deployed the updated (and this time, properly functioning) application several weeks later.  Still, despite the eventual positive ending, I was embarrassed to say the least, and lost a little bit of cred that day.

The Lesson

Was it ever.  I learned some hard lessons that day, lessons that I still carry to this day.

How did it change me?  Let me count the ways:

    • Database normalization. Learn it, live it.
    • Don’t be afraid to admit when you’re over your head, even if it’s late in the process.
    • Test, test, test.  A successful compilation isn’t the end of testing – it’s just the beginning.
    • Testing should involve business users, not just technical staff, and should simulate realistic usage as much as possible.
    • Never implement anything that you can’t be there to support on the day of go-live.
    • Don’t rush deployment.  Missed deadlines will be forgotten, but crappy applications can live on forever.

And the most important lesson I learned that day:

Mistakes are a part of life, but you must a) own your mistakes and b) learn from them.

To that end, the catastrophic deployment described here was actually one of the highlights of my career.  Never before or since has a single incident taught me so much.