Posts tagged How the sausage is made
What has PASS been up to?
Ever find yourself with tons of extra time just looking for something to dig through?
Yeah, me neither… But I do make it a point to go out and read through lots of PASS documents regularly. Sure, some of those documents are not for public consumption, but a large portion of them are available for any PASS member to view. Almost all of them will require you to be logged in to the PASS site.
A good starting point is the PASS Governance page; lots of good stuff hides there. I'm working on getting this page moved out from behind the login wall.
PASS BOD Meeting Minutes are posted on the left-hand side
The Feb 2011 Minutes are here
- Good discussions in here about the globalization of PASS, especially revolving around events
The Jan 2011 Minutes are here
- This was an in-person meeting and there is a literal ton of info in here. Highlights are globalization, Summit 2011 planning, the Summit 2010 post-mortem, 5-year plans, and bylaw changes
PASS Monthly Reports are found in the middle on the left
These are gems that reveal the day-to-day inner workings of the BOD and HQ
The Feb report should be posted in the next day or two
The Jan report, however, is here
- In here you'll find things about Chapters, IT Projects, Marketing initiatives, ERC info, Sponsorship Sales, the Summit Program, SQLRally, Globalization, etc.
The Dec report is here
- This one contains things like Chapter info, HQ Finance, IT Projects, Marketing, Summit, Rally, 24HOP, and SQL Saturday
The budget for PASS is included at the bottom of the governance page
2011 Budget is here
- Wanna know where the money is supposed to be coming from, and where it's supposed to be going? This is where to look.
- Side note: I'm going to check into where the 2010 audited financials are; they should be available by now.
The SQLRally team has posted all of the planning meeting notes here
- There is tons of good stuff in here; it's especially interesting to me to watch the back-and-forth in the minutes as they deal with problems very familiar from what I've seen in the Summit program group.
- Wanna know how many attendees are registered so far for the Rally? Yup, it's in there. Wanna know how many are in precons? Yup, that's in there too.
We (PASS Program) started posting meeting minutes near the lower left side of this page
- I have written about these minutes before
- Good information in here about many new changes that are being considered by the Program Committee
- Essentially, it says that I'm not getting nearly enough done for the Program Committee lately. I need to work on that!
- I'm including this here because lots of good stuff gets posted here, but I can only ever find it because it's in my RSS reader.
In summary, PASS releases a ton of information about what it's doing. The problem is two-fold: one, it's a ton of information; two, the information is spread out all over the place and is often difficult to find on the site using conventional browsing methods. I hope this helps.
Every year PASS asks the speakers at the Summit to agree to some relatively simple terms and conditions. I don't consider them overly involved or overbearing. For those who haven't seen them, they basically establish that a speaker owns the content they are going to present, that speakers act as professionally as possible and don't market their products or their companies' products, and that PASS is allowed to record the sessions.
This year the hangup for me is related to that last tiny bit. For regular conference speakers, being asked to allow recording of a one-hour session isn't a big ask. Where I'm re-evaluating what we've done in the past, however, is the all-day preconference sessions.
Last year PASS recorded the preconference sessions and offered them for sale to PASS members. Just as speakers get a portion of the admission fee for the preconference sessions themselves, the contract called for them to get a portion of the sales of the DVDs. At the time this seemed like a fair way to do things, and I still believe the revenue share is fair.
I've heard from several different people that if these preconference sessions are recorded, it may become more and more difficult for PASS to attract top-tier SQL Server speakers to do precons. I can appreciate the position of some speakers on this: if they are giving their best content and we are distributing it digitally for what amounts to a few hundred dollars, they run the very real risk of losing actual sales of training material, or potential clients.
On the other side, I need to weigh the risk of shrinking the pool of available speakers against the benefit to the community of being able to offer these recordings. The other benefit is, of course, the money PASS makes from DVD sales. To be perfectly clear, the amount of money PASS makes off DVD sales in general is a pittance in the scheme of things. Having the DVDs available and leveraging the content, however, is very valuable to our members, and important enough to at least explore what can be done to find a good balance.
Decisions, Decisions, Decisions
The way I'm leaning is to leave things the way they are and see if we notice an overall drop in the quality or quantity of our preconference presenters in 2011 and onward. I have, however, thought a lot about possible ways we could create a workable model where we allow certain preconference speakers to opt out of recording. This could get really messy administratively, and cause some confusion/anger among attendees not knowing which sessions will be included in the recordings. The other alternative is to stop recording preconference sessions entirely, though I don't think that's a good option.
I guess what I'm trying to do here is expose an internal debate I've been having with myself. I've found that if I spend the time to write something out, it often helps me organize my thoughts. As a bonus, I occasionally get great comments/ideas from the two of you who read this.
The results are in!!!
After tabulating over ten thousand distinct session evaluations for the 2010 PASS Summit, we are pleased to release the top 10 sessions overall and the top 5 sessions per track.
Getting these session results generated and out to the speakers in a timely manner is always challenging. After taking until the second week of January 2010 to return speaker evaluations for the 2009 Summit, we put sweeping changes in place to prevent that from happening again in 2010.
Fortunately, we were very successful in getting the data. We (community volunteers) designed and built a database to house the eval info, along with a system that could be used to enter the evaluations quickly during and shortly after the Summit. This was a resounding success. Where we fell short, unfortunately, was in delivering the data to the speakers and the community. When we designed these systems, the process to send out the evaluations wasn't really discussed, or possibly just wasn't finished (one peril of distributing work is less insight into exactly where things stand). Either way, I wound up at the 23rd hour reworking last year's SSIS package to fit the new database schema.
We delivered evaluations to the speakers a full three weeks earlier than last year, including additional info about overall speaker scores that we had never provided in the past. I realize that a success to me (three weeks sooner) is still a failure to others (four weeks after the Summit to get the data to the speakers). We're going to work on improving this for next year's Summit, but for now I'll take the wins where I can get them!
Getting the top 10 sessions posted has taken an extra three weeks, and I take full responsibility for that one. I had the data on my laptop the entire time; at first it was the holidays, then it was something shiny, and after that I kept running into issues trying to write queries that weren't just usable for this year's Summit but could generate similar results for any event we enter into this database. In the end, though, I have a set of queries for this process that will be reused.
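To give a flavor of what "reusable for any event" means, here's a quick sketch in Python of the top-N-per-track idea. To be clear, the schema, event name, session titles, and scores below are all made up for illustration; the real eval database looks nothing this simple.

```python
from collections import defaultdict

# Hypothetical eval rows: (event, track, session, avg_score) -- illustrative only
rows = [
    ("Summit 2010", "DBA", "Inside the Optimizer", 4.8),
    ("Summit 2010", "DBA", "Backup Internals", 4.6),
    ("Summit 2010", "BI", "SSIS Patterns", 4.7),
    ("Summit 2010", "BI", "Cube Design", 4.5),
]

def top_n_per_track(rows, event, n=5):
    """Return the top-n session titles per track for one event.

    Parameterizing on the event is what makes the same logic
    reusable for any event loaded into the database.
    """
    by_track = defaultdict(list)
    for ev, track, session, score in rows:
        if ev == event:
            by_track[track].append((score, session))
    return {
        track: [title for _, title in sorted(scored, reverse=True)[:n]]
        for track, scored in by_track.items()
    }
```

In the actual database this is one query with a ranking function partitioned by track, but the shape of the logic is the same.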
This database/process was one of the projects that a large group of OUTSTANDING community members chipped in and worked on under the umbrella of the Program Committee in 2010. I have big plans to round up another set of volunteers, put a web-based front end on the db, and push its use out to all SQL events that would like to use it. The information we're gathering will be invaluable to both the speakers and the community in the future.
Oh, hey, dear reader: if you're still reading this far into the babble, I guess you're looking for the Best of PASS Summit 2010 link, right? Without further ado…
PS: No, Adam Machanic, it didn't take 257 weeks to get this out.
It's been a while since I wrote an update about what's been happening in the PASS Program Committee. I just haven't had time to write about it with all of the work going on in addition to my regular day job. Hopefully I'll have time now to do a better job at this!
The annual content survey was sent out and the results are in. I'd like to go on record now and say I'm not a BI user/admin/developer. We took the BI questions from last year's survey (which were obviously from 2008). Unfortunately, while going through and updating the questions, I didn't reach out to a BI person for a gut check on the BI questions, so we wound up with some out-of-date info in that section. I swear we like BI @ PASS; I just goofed, and there's no secret conspiracy. And YES, to the one of you who asked: I do read all of the comments. The good news, for those who asked, is that the survey results will be released as soon as we can get them collated and readable (any day now). **UPDATE** The survey responses are here; there are definitely some very interesting tidbits to be mined from them.
We are making progress on several projects, from redoing the speaker resources to developing a new system to house the speaker evaluation data. As with all things volunteer-driven, these tasks are taking time, but that's not unexpected.
The biggest project I've been spending my time on is the call for speakers. The call for speakers (and the resulting abstract review site) is always a huge undertaking, and this year even more so since we're bringing on a new vendor (the same one that does TechEd). There have been quite a few bumps in the road along the way (I won't bore you to tears with all the details). A steady diet of 1-2 conference calls a week and about 50-100 emails a week, and we're closing in on a usable call for speakers site. The abstract review site, well, that will be the subject of a whole other blog post in the future! I'll just say that right now I'm hoping to find some spare pixie dust, or at least a few extra rolls of duct tape and baling wire, before the close of the call for abstracts.
There have been many discussions about changing some of the SOPs in the Program Committee. I have blogged about some of those previously, so I won't rehash them here. I'll just add a few more ideas I've been kicking around.
One of the biggest things that will affect the average attendee at the Summit is that we're exploring ways to allow two new session types this year.
1) Community selection – The current thought is to allow the community to choose one session per track (or some similar method/amount) from a pre-filtered set of submitted abstracts.
2) Best of the Summit – The current thought is to take the top session(s) from the first two days of the Summit and repeat them on day three.
Both of these ideas have execution issues to overcome, but I think they should be doable for the 2010 Summit.
I've been encouraged to blog openly about the volunteer work that I do for PASS and the processes we use to get the work done. This post is the first in what I hope will be a long series outlining the different things that have to be done to bring a >somewhat< seamless experience to the SQL community.
PASS Processes – what you (don't) want to know
The final PASS Summit session evaluation results have finally been emailed out to the speakers. This brings an interesting month and a half of PASS to a close for me. Back on the 24th of November I asked for some help getting the session evaluations together and generating some results. As it turns out, I had a huge outpouring of support from everyone wanting to help (thanks again!). In the end, though, I wound up working with two volunteers, Tim Mitchell and Christina Leo, as well as Elena Sebastiano from PASS HQ, to make this work…
To get to the end you have to start at the beginning
I've been involved with the Program Committee in various ways since 2006, so I have eval counts going back to 2005. We've tried various ways of upping the evaluation return rate over the years, but until 2009 we had little luck improving it. This is a classic case of "be careful what you wish for, because you just may get it": the 2009 returns came in at roughly 336% of the prior year's count.
2005 — 3518
2006 — 2114
2007 — 2991
2008 — 2379
2009 — 8008
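For the curious, the year-over-year jump is easy to check with the numbers from the table above:

```python
# Eval return counts by year, taken from the table above
counts = {2005: 3518, 2006: 2114, 2007: 2991, 2008: 2379, 2009: 8008}

# 2009's total as a multiple of 2008's: about 3.37x,
# i.e. returns of roughly 336-337% of the prior year's count
ratio = counts[2009] / counts[2008]
```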
While the added evaluations will be of great use to everyone, they created an unanticipated problem: having to enter them manually. PASS hired a temp to enter the data, and since we didn't have an accessible, purpose-built database to store the data in, we decided to use Zoomerang. The sessions were entered directly into a Zoomerang survey, and the results were extracted into an Excel file that was delivered to me.
Once we had the session results in hand, Christina went through the rather painful process of cleansing the data and getting it into a usable format. The data was loaded into a SQL Server database, where Tim built an SSIS package to extract the data into individual Excel spreadsheets that could be emailed to the speakers. Once this was complete, I took a preformatted email Elena had wordsmithed for me and built an additional SSIS package that would read the email addresses from the db and send each speaker their spreadsheet as an attachment. This was an excellent opportunity for me to expand my SQL skills. I don't get to use SSIS in my current position, and I always learn better when I have a real problem that needs solving, so I enjoyed the work.
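Outside of SSIS, that last mail-merge step can be sketched in a few lines of Python. This is purely an illustration of the idea; the sender address, wording, and filename below are placeholders I made up, not anything from the actual package.

```python
from email.message import EmailMessage

def build_eval_email(speaker_name, speaker_addr, body_template,
                     attachment_bytes, filename):
    """Build one per-speaker message with their eval spreadsheet attached."""
    msg = EmailMessage()
    msg["To"] = speaker_addr
    msg["From"] = "program@example.org"  # placeholder sender
    msg["Subject"] = "Your PASS Summit session evaluations"
    msg.set_content(body_template.format(name=speaker_name))
    msg.add_attachment(
        attachment_bytes,
        maintype="application",
        subtype="vnd.ms-excel",  # .xls spreadsheet
        filename=filename,
    )
    # Hand off to smtplib.SMTP(...).send_message(msg) to actually send it
    return msg
```

Loop that over the addresses read from the db and you have the whole send step.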
All was going perfectly, and I was about to move on to my next task when the emails started to flow in from speakers asking where their results were, since their spreadsheets were blank. This caused me to absolutely PANIC. I immediately started to verify where the mixup was; when managing a process with so many moving parts, there's always a chance it was something in the process. After verifying that the evals weren't in the original dataset, I felt quite a relief. It wasn't something in our process that ate the evals; it was something far more sinister…
The case of the missing evals
I contacted HQ about the issue first thing in the morning, and they were obviously thinking the worst, as was I. A few phone calls and emails later, the options were "lost in the Zoomerang DB", "entered incorrectly", or "lost in transit". I wound up getting an email at about 9 PM titled "Crisis Averted". Even though the crisis wasn't any of my doing, you can imagine the relief when I heard that an envelope (or several) containing just over 1,400 evaluations had been found at HQ. They were apparently misplaced during the transit of the hundreds (thousands?) of boxes returning to HQ from the Summit.
Now comes the hard part
About 48 hours later I got a new extract with all of the missing data in it. Given the speed with which we got these 1,400 evaluations back, I can only assume every free hand at HQ was working furiously to enter them. As it turns out, Christina was in Europe and unavailable to recreate what she had originally done, and Tim was busy, so I took on the task of recreating the process myself. Luckily I had the source to Tim's SSIS package, so that wouldn't be too much trouble. After about five more hours of work, I had the data loaded into the proper tables and ready to be reported on. The process was updated, everything was re-run, and with that, all the speakers got their evals and were happy. Success!!
Reporting on the data
I proposed that we generate a page for the Summit '09 site with the top 10 sessions and various other data/metrics, primarily for use by the speakers. In the end, it turned out that this info is very valuable to PASS for generating interest in the quality of the educational opportunities at the Summit. Since there is a value-add, we had to work out how to "properly" release this data. Not a big deal, just an aspect some members of the community might not have thought of. (I know I hadn't.)
The Grand Finale
The link, which I hope will be of some interest to both speakers and potential conference attendees:
Here you should find the top 10 sessions overall and the top 5 sessions per track, as well as all sorts of data I extracted from the evaluation database. It's also worth noting that these pages link directly to the presentations (and recordings, for Summit '09 attendees), so you can relive the best of the best today.
Did I miss something that you think is valuable? Let me know and I'll see about getting it added!!
PASS has some very "interesting" processes backing up the front end, and there is definitely room for improvement. The biggest issue: how do you improve a process like this without spending much (any) money?
We need to design a database >gasp< to hold the speaker eval information, rather than relying on a 3rd party that only exports to Excel
We've already enacted a change for 2010: the registration group will enter the evaluations from the paper forms immediately after they are collected, which should kill the delay in getting results back to the community. We >may< also go to a split online/paper eval process, but I'm hesitant to mess with a process that just produced such a huge improvement, especially after an earlier try at an online process with less-than-stellar results
If we combine these two items, I think it would be outstanding to have a real-time update on the main PASS website during the Summit showing the top 10 sessions so far, and maybe even a "reserved slot" for a repeat of the top session per track?
The scoring system we used to collect the results (very poor, poor, average, very good, excellent) did not work well; we will go back to using only the numbers 1-5 next year.
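For anyone comparing this year's evals against other years, mapping the word scale back onto 1-5 is trivial. A sketch, where the word labels come from the scale above and the numeric targets are the 1-5 scale we're returning to:

```python
# Map the 2009 word scale onto the 1-5 numeric scale used in other years
WORD_TO_SCORE = {
    "very poor": 1,
    "poor": 2,
    "average": 3,
    "very good": 4,
    "excellent": 5,
}

def normalize(responses):
    """Convert a list of word-scale responses to numeric scores."""
    return [WORD_TO_SCORE[r.strip().lower()] for r in responses]
```

With both years on the same numeric scale, averages and top-10 lists stay comparable.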
I'd estimate that I spent somewhere between 50 and 60 hours completing this task, and I'll admit that some of that was learning new things in SSIS, but you'd be amazed how many emails it took to put this piece of info out for all to see.
Photo courtesy of Kristin