How a Bacon Explosion is Made

Openness

I've been encouraged to blog openly about the volunteer work that I do for PASS and the processes we use to get the work done.  This blog is the first in what I hope will be a long series outlining the different things that have to be done to bring a >somewhat< seamless experience to the SQL community.

PASS Processes – what you (don't) want to know

The final PASS Summit session evaluation results have been emailed out to the speakers.  This brings an interesting month and a half of PASS work to a close for me.  Back on the 24th of November I asked for some help getting the session evaluations together and generating some results.  As it turns out, I had a huge outpouring of support from everyone wanting to help (thanks again!).  In the end, though, I wound up working with two volunteers, Tim Mitchell and Christina Leo, as well as Elena Sebastiano from PASS HQ, to make this work…

 

To get to the end you have to start at the beginning

I've been involved with the program committee in various ways since 2006, so I have eval counts going back to 2005.  We've tried various ways of upping the evaluation return rate over the years, but until 2009 we had little luck improving it.  This is a classic example of being careful what you wish for, because you just may get it: the 2009 return came in at 336% of the 2008 count.

2005 — 3518
2006 — 2114
2007 — 2991
2008 — 2379
2009 — 8008

While the added evaluations will be of great use to everyone, they created an unanticipated problem: all of that paper had to be entered manually.  PASS hired a temp to enter the data, and since we didn't have an accessible, purpose-built database to store it in, we decided to use Zoomerang.  The sessions were entered directly into a Zoomerang survey and the results were extracted into an Excel file that was delivered to me.

Once we had the session results in hand, Christina went through the rather painful process of cleansing the data and getting it into a usable format.  The data was loaded into a SQL Server database, where Tim spent his time building an SSIS package to extract the data and put it into individual Excel spreadsheets that could be emailed to the speakers.  Once this was complete, I took a preformatted email Elena had wordsmithed for me and built an additional SSIS package that would read the email addresses from the database and send each speaker their Excel spreadsheet as an attachment.  This was an excellent opportunity for me to expand my SQL skills.  I don't get to use SSIS in my current position, and I always learn better when I have a real problem that needs solving, so I enjoyed the work.
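
For flavor, here's roughly what that mail-out step boils down to if you strip away the SSIS plumbing.  This is a minimal T-SQL sketch, not the actual package: the table and column names (dbo.Speakers, EmailAddress, SpreadsheetPath) and the 'PASSMail' Database Mail profile are all hypothetical.

```sql
-- Rough T-SQL equivalent of the mail-out step (ours was an SSIS package).
-- dbo.Speakers, EmailAddress, and SpreadsheetPath are hypothetical names,
-- and 'PASSMail' assumes a configured Database Mail profile.
DECLARE @Email nvarchar(256), @Attachment nvarchar(512);

DECLARE speaker_cursor CURSOR FOR
    SELECT EmailAddress, SpreadsheetPath
    FROM dbo.Speakers
    WHERE SpreadsheetPath IS NOT NULL;

OPEN speaker_cursor;
FETCH NEXT FROM speaker_cursor INTO @Email, @Attachment;

WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name     = 'PASSMail',
        @recipients       = @Email,
        @subject          = 'Your PASS Summit 2009 session evaluation results',
        @body             = 'Your evaluation results are attached.',
        @file_attachments = @Attachment;

    FETCH NEXT FROM speaker_cursor INTO @Email, @Attachment;
END

CLOSE speaker_cursor;
DEALLOCATE speaker_cursor;
```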

All was going perfectly, and I was about to move on to my next task, when the emails started to flow in: speakers asking where their results were, since their spreadsheets were blank.  This caused me to absolutely PANIC.  I immediately started to verify where the mix-up was; when managing a process with so many moving parts, there's always a chance that it was something in the process.  After verifying that the evals weren't in the original dataset, I felt quite a relief.  It wasn't something in our process that ate the evals; it was something far more sinister…
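
The sanity check itself was the easy part; something along these lines, with hypothetical table names (dbo.Sessions, dbo.Evaluations):

```sql
-- Which sessions have zero evaluations in the loaded dataset?
-- Table and column names here are hypothetical.
SELECT s.SessionID, s.SessionTitle, s.SpeakerName
FROM dbo.Sessions AS s
WHERE NOT EXISTS (
    SELECT 1
    FROM dbo.Evaluations AS e
    WHERE e.SessionID = s.SessionID
)
ORDER BY s.SpeakerName;
```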

The case of the missing evals

I contacted HQ about the issue first thing in the morning, and they were obviously thinking the worst, as was I.  A few phone calls and emails later, the options were "lost in the Zoomerang DB", "entered incorrectly", or "lost in transit".  I wound up getting an email at about 9 PM titled "Crisis Averted".  Even though the crisis wasn't any of my doing, you can imagine the relief when I heard that an envelope (or several) containing just over 1400 evaluations had been found at HQ.  They were apparently misplaced during the transit of the hundreds (thousands??) of boxes returning to HQ from the Summit.

Now comes the hard part

About 48 hours later I got a new extract with all of the missing data in it.  I can only assume, given the speed with which we got these 1400 evaluations, that every free hand at HQ was working furiously to get them entered.  As it turns out, Christina was in Europe and unavailable to recreate what she had originally done, and Tim was busy, so I took on the task of recreating the process from the first time around.  Luckily, I had the source to Tim's SSIS package, so that wouldn't be too much trouble.  After about 5 more hours of work I had the data loaded into the proper tables and ready to be reported on.  The process was updated, everything was re-run, and with that all the speakers got their evals and were happy.  Success!!
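
Re-loading meant folding the late rows in without duplicating anything already loaded.  A sketch of the idea; all the names here are hypothetical, including the staging table and the per-sheet identifier:

```sql
-- Fold the late-arriving rows into the main table without duplicating
-- anything already loaded. All names here are hypothetical: the new
-- extract lands in dbo.EvaluationsStaging, and EvalSheetNumber stands in
-- for whatever per-sheet identifier the extract carries.
INSERT INTO dbo.Evaluations (SessionID, EvalSheetNumber, OverallScore, Comments)
SELECT st.SessionID, st.EvalSheetNumber, st.OverallScore, st.Comments
FROM dbo.EvaluationsStaging AS st
WHERE NOT EXISTS (
    SELECT 1
    FROM dbo.Evaluations AS e
    WHERE e.SessionID = st.SessionID
      AND e.EvalSheetNumber = st.EvalSheetNumber
);
```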

Reporting on the data

I proposed that we generate a page for the Summit 09 site with the top 10 sessions and various other data/metrics, primarily for use by the speakers.  In the end it turned out that this info is very valuable to PASS for generating interest in the quality of the educational opportunities at the Summit.  Since there is a value-add, we had to work out how to "properly" release this data.  Not a big deal, just an aspect some members of the community might not have thought of.  (I know I hadn't.)
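
The top-10 ranking itself is straightforward SQL.  A minimal sketch over hypothetical table names, with a response-count floor so a session with two glowing evals can't outrank one with two hundred:

```sql
-- Top 10 sessions by average overall score, requiring a minimum number
-- of responses. Table and column names are hypothetical.
SELECT TOP (10)
    s.SessionTitle,
    s.SpeakerName,
    AVG(CAST(e.OverallScore AS decimal(4,2))) AS AvgScore,
    COUNT(*) AS EvalCount
FROM dbo.Sessions AS s
JOIN dbo.Evaluations AS e
    ON e.SessionID = s.SessionID
GROUP BY s.SessionTitle, s.SpeakerName
HAVING COUNT(*) >= 10          -- arbitrary floor to keep tiny samples out
ORDER BY AvgScore DESC, EvalCount DESC;
```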

 

The Grand Finale

 

Here's the link, which I hope will be of some interest to both speakers and potential conference attendees.

Here you should find the top 10 sessions overall and the top 5 sessions per track, as well as all sorts of data that I extracted from the evaluation database.  It's also worth noting that these pages link directly to the presentations (and recordings, for Summit 09 attendees), so you can relive the best of the best today.
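
The per-track top 5 is the same idea with a window function; again just a sketch over hypothetical names:

```sql
-- Top 5 sessions within each track, ranked by average overall score.
-- Table and column names are hypothetical.
WITH Ranked AS (
    SELECT
        s.Track,
        s.SessionTitle,
        AVG(CAST(e.OverallScore AS decimal(4,2))) AS AvgScore,
        ROW_NUMBER() OVER (
            PARTITION BY s.Track
            ORDER BY AVG(CAST(e.OverallScore AS decimal(4,2))) DESC
        ) AS TrackRank
    FROM dbo.Sessions AS s
    JOIN dbo.Evaluations AS e
        ON e.SessionID = s.SessionID
    GROUP BY s.Track, s.SessionTitle
)
SELECT Track, SessionTitle, AvgScore
FROM Ranked
WHERE TrackRank <= 5
ORDER BY Track, TrackRank;
```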

Did I miss something that you think is valuable?  Let me know and I'll see about getting it added!!

http://www.sqlpass.org/Events/BestOfSummit.aspx

 

Takeaways

PASS has some very "interesting" processes backing up the front end, and there is definitely room for improvement.  The biggest issue: how do you improve a process like this one without spending very much (any) money?

We need to design a database >gasp< to hold the speaker eval information, rather than relying on a third party that only exports to Excel.
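
Nothing fancy is needed here.  A first-pass sketch of what such a schema might look like (purely hypothetical, not an agreed-upon PASS design):

```sql
-- A first pass at a home-grown eval schema. Everything here is a
-- hypothetical sketch, not an agreed-upon PASS design.
CREATE TABLE dbo.Sessions (
    SessionID    int IDENTITY(1,1) PRIMARY KEY,
    SessionTitle nvarchar(200) NOT NULL,
    SpeakerName  nvarchar(100) NOT NULL,
    Track        nvarchar(50)  NOT NULL
);

CREATE TABLE dbo.Evaluations (
    EvalID       int IDENTITY(1,1) PRIMARY KEY,
    SessionID    int NOT NULL REFERENCES dbo.Sessions (SessionID),
    OverallScore tinyint NOT NULL CHECK (OverallScore BETWEEN 1 AND 5),
    SpeakerScore tinyint NULL CHECK (SpeakerScore BETWEEN 1 AND 5),
    Comments     nvarchar(max) NULL,
    EnteredOn    datetime NOT NULL DEFAULT GETDATE()
);
```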

We've already enacted a change for 2010: the registration group will enter the evaluations from the paper forms immediately after they are collected, which should kill the delay in getting results back to the community.  We >may< also go to a split online/paper eval process, but I'm hesitant to mess with a process that just gave us such a huge improvement, especially after an earlier attempt at an online process produced less than stellar results.

If we combine these two items, I think it would be outstanding to have a real-time update on the main PASS website during the Summit showing the top 10 sessions so far, and maybe even a "reserved slot" for a repeat of the top session per track.

The scoring system that we used to deliver the results (very poor, poor, average, very good, excellent) did not work well; we will go back to using only the numbers 1–5 next year.
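
For the curious, translating those text labels back into numbers for reporting is a simple CASE expression; the table and column names here are hypothetical:

```sql
-- Translate the 2009 text labels back into a 1-5 score for reporting.
-- dbo.EvaluationsRaw and its columns are hypothetical names.
SELECT
    SessionID,
    CASE OverallRating
        WHEN 'very poor' THEN 1
        WHEN 'poor'      THEN 2
        WHEN 'average'   THEN 3
        WHEN 'very good' THEN 4
        WHEN 'excellent' THEN 5
    END AS OverallScore
FROM dbo.EvaluationsRaw;
```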

I'd estimate that I spent somewhere between 50 and 60 hours completing this task, and I'll admit that some of that was spent learning new things in SSIS, but you'd be amazed how many emails it took to put this piece of info out for all to see.

 

Photo courtesy of Kristin

 

You're still here?  Did you really read all of this?  Well, if you did, you should really think about taking up knitting or some other worthy pastime =)