Cool New Features in SSMS – Recent Expensive Queries

Depending on whether you are more of a developer or more of a DBA, you may care more or less about this next feature. Either way, anyone can benefit from it. SSMS 2008 makes it very easy to crack open the Activity Monitor, as there is a noticeable button right on the toolbar.

[Screenshot: the Activity Monitor button on the SSMS 2008 toolbar]

Once inside you get a few charts, as well as some tables of performance-related data below. This is clearly not as exciting on my local Vista machine as it would be on a pounded production database, but you can see how a tool like this could come in handy.

Each chart represents one of these metrics:

  • % Processor Time
  • Waiting Tasks
  • Database I/O
  • Batch Requests/Sec

Probably the most interesting feature to me is the Recent Expensive Queries table. This lets you dynamically analyze the queries with the biggest impact on your server. Since that impact can be determined by a number of factors, it allows you to sort by:

  • Executions
  • CPU
  • Physical Reads
  • Logical Writes
  • Logical Reads
  • Average Duration
  • Plan Count

Even cooler, you can drill into the actual query itself with a right-click option:

[Screenshot: right-clicking a row in the Recent Expensive Queries table]
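
Under the covers, Activity Monitor is surfacing data from SQL Server's dynamic management views. If you ever want similar numbers without the GUI, here is a minimal sketch that pulls the top queries by CPU from sys.dm_exec_query_stats. This is just my rough approximation of what the grid shows (not the actual query SSMS runs), and the connection string is a placeholder:

using System;
using System.Data.SqlClient;

class ExpensiveQueries
{
    static void Main()
    {
        // Approximation of the Recent Expensive Queries grid:
        // top cached queries ordered by total CPU time.
        const string query = @"
            SELECT TOP 10
                   qs.execution_count,
                   qs.total_worker_time,
                   qs.total_logical_reads,
                   st.text AS query_text
            FROM sys.dm_exec_query_stats qs
            CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
            ORDER BY qs.total_worker_time DESC;";

        using (var conn = new SqlConnection(@"Server=.;Integrated Security=SSPI"))
        using (var cmd = new SqlCommand(query, conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0,10} executions, {1,12} cpu: {2}",
                        reader["execution_count"],
                        reader["total_worker_time"],
                        reader["query_text"]);
                }
            }
        }
    }
}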

JSON Lives

It has always been interesting to me how long it can take a technology to become truly rooted in the minds and toolsets of the software development community. This mandatory waiting period for technological stability seems to be just long enough that most of the fads fall through the cracks. Many tech buzzwords get thrown around, and there is no shortage of hype. Web 2.0, AJAX, WCF, WF, Ruby on Rails, NUnit, TDD, Scrum, Agile, Pair Programming… Looking at the topics PWOP Productions has covered over the past year, I see a bunch of stuff that is undeniably cool, but typically not what I use in my day-to-day work. With a few notable exceptions, the majority of the newer technologies have not made it into corporate IT departments just yet. Actuarial types like to see the riff-raff settle out before they build their infrastructure on top of it.

Most of what you see flying through your RSS reader or popping up on your iPod amounts to curious glances at interesting ideas that will ultimately be blips on the radar screen of 21st century computing. To provide an example: I am a huge fan of CruiseControl.NET, but I believe its days are numbered. It is only a matter of time before Microsoft refines TFS, integrates it into Visual Studio, and makes it cheap enough that you would have to be a zealot to ignore its utility. Does anybody remember Netscape?

Now, I am not trying to make the point that Microsoft eventually gobbles everything up and is masterfully elegant at copying other people's great ideas. Everyone already knows that. My point is that it is possible to expend a whole bunch of brain cycles on stuff that does nothing more than… take up your brain cycles. So how do you pick the winners? How do you know what to learn today so that 2 years down the road you will be the expert in the technology that is threaded through the fabric of all things software? How could I have known 10 years ago that time would be better spent learning XML than COM?

I don’t know.

But what I do know is that I like the simplicity and lightweight nature of JSON (JavaScript Object Notation). It seems like something that satisfies the requirements of what 95% of XML is used for, without the verbosity. It has the simplicity of URL-querystring-style name/value pairs, with just enough governance and convention to provide some stability.

JSON is essentially made up of objects, arrays, and values.  Values can come in any of the following flavors:

  • string
  • number
  • object
  • array
  • true
  • false
  • null

So you are probably starting to see the recursive nature of the format, in that values can themselves be objects and arrays:

var tacos = {
  "breakfast":
    [ {"meat":"sausage","filler":"egg","quantity":1},
      {"meat":"bacon","filler":"egg","quantity":1} ],
  "dinner":
    [ {"meat":"beef","filler":"beans","quantity":2},
      {"meat":"chicken","filler":"lettuce","quantity":1} ]
};

Although there are several ways to describe this in XML, the most common would probably be:

<tacos>
  <breakfast>
    <meat>sausage</meat><filler>egg</filler><quantity>1</quantity>
    <meat>bacon</meat><filler>egg</filler><quantity>1</quantity>
  </breakfast>
  <dinner>
    <meat>beef</meat><filler>beans</filler><quantity>2</quantity>
    <meat>chicken</meat><filler>lettuce</filler><quantity>1</quantity>
  </dinner>
</tacos>

As the example above shows, XML can be quite a bit more verbose than JSON. The JSON example has 223 characters while the XML example has 304, which is roughly a 25% savings from using the JSON data format. Because of the importance of quick response times for background HTTP requests, JSON has taken off in the AJAX community more quickly than in other circles (which is ironic, because the X in AJAX stands for XML :)).
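
And JSON is not only a browser-side thing. As a quick sketch of consuming it from .NET, here is one way you might deserialize the tacos example using the JavaScriptSerializer class that ships with .NET 3.5 in System.Web.Extensions (the Taco class and its lowercase property names are just my inventions to match the JSON keys):

using System;
using System.Collections.Generic;
using System.Web.Script.Serialization;  // System.Web.Extensions.dll

// Hypothetical class shaped to match the taco objects above.
public class Taco
{
    public string meat { get; set; }
    public string filler { get; set; }
    public int quantity { get; set; }
}

class JsonDemo
{
    static void Main()
    {
        string json = @"{""breakfast"":
                           [{""meat"":""sausage"",""filler"":""egg"",""quantity"":1},
                            {""meat"":""bacon"",""filler"":""egg"",""quantity"":1}],
                         ""dinner"":
                           [{""meat"":""beef"",""filler"":""beans"",""quantity"":2},
                            {""meat"":""chicken"",""filler"":""lettuce"",""quantity"":1}]}";

        var serializer = new JavaScriptSerializer();
        var meals = serializer.Deserialize<Dictionary<string, List<Taco>>>(json);

        foreach (Taco taco in meals["dinner"])
            Console.WriteLine("{0} with {1} (x{2})", taco.meat, taco.filler, taco.quantity);
    }
}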

So is JSON worth spending time on? Will it gain schema support as robust as XML's and develop into the next superstar that shows up as the format for configuration files in Visual Studio 2010?

What do you think?

Data in the Cloud? Not For Me…

As I was cheerfully driving into work and auditorily inhaling my daily dose of .NET Rocks, I heard a rather compelling discussion between Scott Cate and Carl Franklin/Richard Campbell.

[Screenshot: entry page for Shoope]

The site mentioned in the podcast was called Shoope, which I have to say is kind of a stupid name. Partially, I may be thinking the name sucks because of an annoying, catchy little phrase I can't get out of my head. Since I have unsuccessfully tried to register for the beta about 10 times over a span of 10 hours to try out the mystical data store, my mind keeps heckling, “Shoope is Poop!” The logo only bolsters this unfair moniker by having the “p” defecate some mysterious pile of goo. What's up with that?

Now I am pretty sure that this is not the case, due to the caliber of people involved in the project. Scott Cate and Rob Howard seem to be very intelligent guys, and I am sure they have teamed up with some other really talented developers for this new concept. So why do I get a SQL timeout every time I try to see what they're up to? Is it the overwhelming popularity of the announcement on .NET Rocks? Cate opened up the beta testing to the first 5000 listeners that registered with code “DNR” on the website.

So what is Shoope? It is an online data store that puts your data “in the cloud” for access and sharing anywhere. They provide a set of dynamic services that allow you to access, modify, and search across your data. Things seem to be just out of the gate with the beta testing, so I haven't been able to find out much else online; my efforts to try the beta will continue as the traffic hopefully dies down.

[Screenshot: couldn't get in over a span of 10 hours]

I can appreciate the volume of traffic that anything mentioned as free on DNR would generate. After all, I don't even have a use case for Shoope; I just wanted to see what it was all about. Nonetheless, I was underwhelmed by the fact that they neither anticipated that volume of traffic nor handled it once the site started cratering. The thought of throwing your data out “into the cloud” is a discomforting one, and this doesn't do much in the way of building warm fuzzies.

Do Not Use the StringBuilder

I know there are plenty of people out there that will disagree with me, but I urge you not to use the .NET StringBuilder class in 99% of the cases where you would be tempted or advised to do so. Here is why:

The StringBuilder class does perform better than basic string concatenation. However, the difference is really only noticeable with ridiculous string lengths. For example, if you are concatenating a SQL statement together for a statement that has fewer than 100 columns, the difference will be negligible. Slamming 10,000 strings together makes StringBuilder the clear winner, as it will shave about two tenths of a second off of your processing time.

[Image: testing in progress]

I ran some tests, and comparing basic concatenation to using a StringBuilder to slam together 100 “A” characters is a wash. Sometimes the StringBuilder even took longer. At 1000 characters, StringBuilder was a slight winner. Of course, this amounted to about 20 milliseconds on average. Unless you are doing a lot of concatenation, this won't really be a noticeable difference… and we are already talking about 1000 operations!
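
For reference, a minimal harness along the lines of my tests (not my exact test code, but the same idea) looks something like this:

using System;
using System.Diagnostics;
using System.Text;

class ConcatBenchmark
{
    static void Main()
    {
        const string piece = "AND COLUMN1 = 'VALUE' ";
        const int operations = 1000;

        // Naive concatenation: each += allocates a brand new string
        // and copies everything accumulated so far into it.
        Stopwatch sw = Stopwatch.StartNew();
        string s = string.Empty;
        for (int i = 0; i < operations; i++)
            s += piece;
        sw.Stop();
        Console.WriteLine("Concatenation: {0} ms", sw.ElapsedMilliseconds);

        // StringBuilder: appends into an internal buffer that grows
        // occasionally, instead of copying on every single append.
        sw = Stopwatch.StartNew();
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < operations; i++)
            sb.Append(piece);
        string built = sb.ToString();
        sw.Stop();
        Console.WriteLine("StringBuilder: {0} ms", sw.ElapsedMilliseconds);
    }
}

Vary the piece size and operation count and you can reproduce the general shape of the table below.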

Below are the results of my tests (click on the thumbnail); they were not posted with my original rant, but have since been added to this post. You can see the strings of varying lengths I used, as well as the number of concatenation operations performed on each. The “Total String Length” column represents the overall length of the resulting string after all operations are complete.

[Thumbnail: table of test results]

As you can see, nothing short of millions of resulting characters or tens of thousands of concatenation operations ends up with a response time over 1 second. Looking at a typical operation in my world, such as concatenating a SQL statement together, we would be dealing with something like rows 9 and 10 of the above table. These tests took the string “AND COLUMN1 = ‘VALUE’” and concatenated it together 1,000 times. Although this would still be a ridiculously long SQL statement (and probably perform horribly :)), the difference between using a StringBuilder (193 ms) and simple concatenation (214 ms) is only 21 milliseconds. Yes, I realize that is about 10%, but again, if you are doing this concatenation many times, you may need to write your own custom object to meet your true needs. Another interesting find is that the number of operations seems to have a much bigger impact than the overall length of the string itself, which makes sense: each naive concatenation allocates a new string and copies everything accumulated so far, so the cost grows with every additional operation.

Yes, it is terrific that somebody at Microsoft realized that working with a collection (a string) of hundreds of thousands of members (characters) will not perform well if it is not indexed. No kidding! But who in their right mind will be dealing with strings of 500,000+ characters? If you are doing that, then you probably need to rethink your approach to solving the problem at hand.

Because of the added complexity, additional lines of code, developer ramp-up time (for those not exposed to the class or .NET), and risk of change in future versions of .NET, I say you should not use the StringBuilder class… unless, of course, you find yourself needing to deal with that all too common million-character string 🙂