VSTSC's Community Server




Tim Benninghoff

A SQL Server DBA attempts a personal IT project and documents the (mis)adventures.

  • Blogging about blogging

    Here I am, about 5 months into blogging in 2009, and I find myself unsatisfied with the frequency with which I update this blog.  Sure, there is some content with which I am well pleased, but I just can't seem to understand why I can barely reach the 1 post-a-week goal I had set for myself back in January.  What gives?

     So, over the past 2 weeks I started to seriously think about the entire process I use to decide whether to blog about a subject or not, and I realized that I ask myself a number of questions whenever an idea for a blog post floats through my grey matter.  The questions go something like this:

    Has it been blogged already?  If I google the subject, will I find the answer out there already?  Am I contributing to the discussion in any way, or just adding to internet detritus?

    Opinions are like....   Everyone's got an opinion.  What makes mine so special?  If I can't come up with an angle, I won't post it.

     Would I want to read about that?  If I were a follower of my own blog, would I want to read what I'd posted?

    And I find that, while on any given day I can usually come up with 3 or 4 ideas that might make a passable blog post, most of them never make the cut because they can't make it through the preliminary examination questions.  I'm not terribly satisfied by that result.

    Yes, the internet is one giant data store, but even those of us with the greatest of google-fu know that it is sometimes very difficult to find an answer to a particular question.  And, if you'd bear with my poetic license for a moment, data and information aren't just a reservoir we go to when we need something; they're the stuff we swim through.  It's the conversation I had today with my wife about carbohydrates, it's the facebook message I got from a friend about a place where we both used to work, it's the question someone asked on Twitter about SQL.  And so, dealing with data is a combination of looking for it and having it find us when we least expect it.  When I self-limit what I blog about, am I preventing data from finding someone?

     So, as a warning to my kind readers, I think I'm going to experiment for a month by NOT asking myself those little questions about whether now is a time to blog, and see if that gets me to a happier blogging place.  Pardon the dust.

  • Creating a Reporting Services rdl from PowerShell

    It's no small secret that I'm a big fan of using PowerShell to manage SQL Servers.  I've gone through all the trouble of loading the SQL Server assemblies and/or configuring PowerShell to use the SQL Server provider, and I've jumped into SMO, sometimes to much joy, and sometimes to much sorrow.  But, the other day I started to wonder, 'Where's the object model for Reporting Services?  And, being a big fan of the Script method from SMO, can I script out an rdl of a report?'

     I started digging deeper, and sadly I could find no assembly or aspect of the SQL Server provider that would grant me easy(?) access to the Reporting Services objects.  So, I hit the documentation to find out why.

    As Books Online was eager to point out, Reporting Services is fundamentally a web service that is accessed via the Simple Object Access Protocol (SOAP) over HTTP.  Huh?  Well, our friend BOL tries to explain further, stating that the web service has endpoints that you can connect to in order to either kick off reports or manage your report server.  So, great, how do I connect to a web service?

    The internet abounds with suggestions, most of which revolve around using Visual Studio and the .NET Framework SDK to generate a proxy of the web service from its WSDL, thereby gaining access to the objects and the methods therein.  And, as I read those suggestions, I began to empathize, ever so briefly, with developers who must likewise look at relational databases as if they were some sort of horrible, horrible spawn of darkness.  Mainly, though, I was wondering how I was going to get around this problem without having to hit up my boss for a license to Visual Studio, and then spending hours learning the ins and outs of web services, proxies, and the great and glorious WSDL.exe proxy generator.

    But, thankfully, the PowerShell community generally rules, and I was able to find a nifty little PowerShell script by Oisin Grehan which allows me to create a proxy to my Reporting Services web service without all the aforementioned yuckiness.  So, having downloaded the script, scanned it for a basic understanding of what's going on, and then placed it in my trusty C:\PowerShell directory, I fired up PowerShell and ran the following:

    $url = "http://<ServerName>/ReportServer/ReportService2005.asmx"
    $proxy = C:\PowerShell\New-WebServiceProxy.ps1 -Url $url -SoapProtocol Soap

    I provide my credentials at the prompt the script generates, and now, suddenly, I have access to information regarding the properties and methods I'm looking for on my Reporting Services installation with the simple yet powerful:

    $proxy | gm

    OK, great, so now we can access our Reporting Services objects, but what about generating that rdl for a report?  If I know the report name, including its full deployed path of folders, I can use the GetReportDefinition method to get a byte array of the report:

    $report = $proxy.GetReportDefinition("/<FolderPath>/ReportName")

    And, then it's just a matter of converting that byte array to text, and writing it out to a file:

     [System.Text.Encoding]::ASCII.GetString($report) >> ReportName.rdl
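
    Putting the pieces together, here's a rough end-to-end sketch.  The server name, folder, and report name are placeholders, and it assumes the New-WebServiceProxy.ps1 script is sitting in C:\PowerShell as described above:

    # Build a proxy to the Reporting Services 2005 web service (placeholder server name)
    $url = "http://MyReportServer/ReportServer/ReportService2005.asmx"
    $proxy = C:\PowerShell\New-WebServiceProxy.ps1 -Url $url -SoapProtocol Soap

    # GetReportDefinition returns the deployed report as a byte array (placeholder path)
    $bytes = $proxy.GetReportDefinition("/MyFolder/MyReport")

    # Decode the bytes and write the rdl to disk
    [System.Text.Encoding]::ASCII.GetString($bytes) | Out-File "MyReport.rdl"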

    Still, I think it would be much nicer if the SQL Server PowerShell provider made it possible to access Reporting Services as part of the SQLServer: PSDrive.

  • Object Limit and SQL PowerShell Provider Answered

    In a previous blog post I talked about being puzzled as to why I wasn't able to get, in my mind, accurate table counts for a database in SQL Server 2005 that had over 50,000 tables.  I kept getting only 1,000.

     With the help of Chad Miller, I've figured it out.

    To reiterate, when I had navigated to the PSDrive for the tables within one of my databases and had run the following command:

    (get-childitem *).count

    I got 1000, even though I clearly had more than 1000 tables in my database.  But, a curious thing happens when I alter the command slightly:

    (get-childitem).count

    I get the full 50,000+!?!?  So, what's going on?

    Well, it looks like one of the other global variables for the PowerShell provider, $SqlServerMaximumTabCompletion, is getting into the mix.  In my particular environment I have this set to 1000.  Coincidence?  Apparently not.  It seems that the use of the wildcard, *, is limited by this global variable: when I change $SqlServerMaximumTabCompletion to 0, which means unlimited, (dir *).count returns the correct number of tables.
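
    For anyone who wants to check their own session, here's a quick sketch (it assumes the SQL Server provider is already loaded):

    # Inspect both provider limits mentioned in these posts
    Get-Variable SqlServerMaximumTabCompletion, SqlServerMaximumChildItems

    # 0 means unlimited; with the cap lifted the wildcard count comes back correct
    Set-Variable -Name SqlServerMaximumTabCompletion -Scope Global -Value 0
    (Get-ChildItem *).Count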

    Likewise, when I use wildcards to return a list of tables that contain a certain string in the name:

    get-childitem *ROLE*

    I now get all of the tables I'm expecting.

    It seemed counter-intuitive to me at first, but it seems to make sense because of the way the wildcard can be used in conjunction with tab completion.

    Posted Wednesday, April 22, 2009 5:59 PM by timbenninghoff | 0 Comments
  • SQL Quiz, WHAT?

    I was tagged in a blog post making the rounds based on the question, 'Who has been a great leader in your career and what made them a great leader?'

    I think I'll join the bandwagon on this one and politely decline to talk about great leaders in my career, except perhaps obliquely, and instead focus on what makes a great leader.  What makes a great leader?  In my mind, it's the following things:

     Advocacy

     A leader knows how to advocate for those he is leading, as well as for the things his followers truly believe are the right direction to take.

     A Good Listener

     No, this isn't a cheesy personal ad.  I think good leaders know how to listen to their subordinates and know when to rely on their technical expertise to help fill in holes in their own understanding of a situation.  Using this information, they are able to make good decisions on direction and set challenging, yet attainable goals.

    A Keen, Analytical Mind

    To me, a leader also has the ability to quickly identify a problem, either because they understand the inner workings of the technology, or the political situation, or because they've listened to their technical experts and know how to synthesize that information in order to make the best decisions.  Using this information, they figure out what needs to be done, they know how to direct the resources they have at their disposal, and, because they know their subordinates, they know what they need to do in order to inspire and motivate their employees.

    Trust

     This one is huge.  Leaders have to be trustworthy or everything else is pointless.  Don't do anything to lose trust.  It's really hard to get back.

     Connect on a Personal Level

     A good leader knows you.  They know what motivates you, they know your weaknesses, and they know how to get the best from you.  They know how to challenge you without overwhelming you.

     

    And, though I'm not tagging anyone further on this one, I PROMISE to tag at least one person next time one of these blog memes hits me.

  • Any Number You Want As Long As It's 1

    Any customer can have a car painted any colour that he wants so long as it is black. - Henry Ford in his autobiography "My Life and Work"

    SQL Server has a stored procedure for creating new SQL Agent jobs called sp_add_job, and this stored procedure has a parameter named @start_step_id.  However, when you try to create a new job with a start step other than 1, you get the following error:

    Msg 14266, Level 16, State 1, Procedure sp_verify_job, Line 75
    The specified '@start_step_id' is invalid (valid values are: 1).
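
    If you'd like to see the error for yourself, here's a quick repro sketch using invoke-SQLCMD; the server and job names are made up:

    # Asking for a start step other than 1 trips the check in sp_verify_job
    $q = "EXEC msdb.dbo.sp_add_job @job_name = N'DemoJob', @start_step_id = 2;"
    Invoke-Sqlcmd -ServerInstance "Server01" -Query $q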

    As the error indicates, and we can verify by digging into the code, sp_add_job also kicks off an undocumented stored procedure called sp_verify_job.  Looking at the code of sp_verify_job, we find this little nugget in the comments:

     -- For [new] MSX jobs we allow the start step to be other than 1 since
        --       the start step was validated when the job was created at the MSX

     Since I've never been anywhere that created multiserver environments, I'll have to take their word for it, although I did begin to suspect that if you created a job on a master server using sp_add_job you'd only be able to start at step 1, and that this parameter is really just so that the multiserver environment can use it to migrate jobs around.

     What's my main point?  Documentation is our friend.  I shouldn't have to dig into a nested stored procedure to find out that there is really no way that I, a humble user, can take advantage of the @start_step_id, and I should in turn not subject my fellow database professionals to this sort of information chase.

     

  • Object Limit And SQL PowerShell Provider?

    One of the applications we have from a vendor has a SQL Server 2005 database with over 50,000 tables.  I was using the PowerShell provider for SQL Server to search through the tables to find any that had the word 'ROLE' in it using the following command:

    get-childitem *ROLE*

    But, in the results, only two tables were returned, and I was expecting plenty more.  In T-SQL I ran the following query:

    SELECT count(*) FROM sys.tables
    WHERE [name] LIKE '%ROLE%';

     This time I see that there are 133 tables in the database that have the string 'ROLE' in the name.

    I recalled reading a blog somewhere about folks hitting an object limit when returning objects from Active Directory in PowerShell, so I began to wonder if there isn't some sort of configuration inside the PowerShell provider for SQL Server that lets you set a limit for get-childitem.  And, sure enough, there is a global variable called $SqlServerMaximumChildItems.  However, mine is set to 0, which means I should be returning an unlimited number of objects.

    I head back into PowerShell, navigate to the SQLSERVER:\SQL\<ServerName>\<InstanceName>\Databases\<DatabaseName>\Tables path, and run the following statement:

    (get-childitem *).count

     1000?  What if I change the SqlServerMaximumChildItems global variable to 20 and try the count again?

    set-variable -name SqlServerMaximumChildItems -scope global -value 20

    Nope, the directory count still shows 1000.  What if I bump up the variable to 2000?  Directory count still shows 1000.

     I try writing out the list of table names to a text file, and only 1000 table names are written.  Does PowerShell have a 1000 object limit?  That doesn't seem right, and my internet searches can't find anything to even suggest that.  In fact, I use invoke-SQLCMD to select the table names from sys.tables again and I get all 50,000 objects.  What about SMO 10.0?  That doesn't seem right either.  Something about the way I set up PowerShell to work with the SQL Server provider?  Is there a bug in the $SqlServerMaximumChildItems global variable?
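
    For reference, this is roughly the cross-check I was doing; the server and database names are placeholders:

    # Count the tables through the provider path...
    (Get-ChildItem SQLSERVER:\SQL\Server01\DEFAULT\Databases\MyDatabase\Tables).Count

    # ...and again with straight T-SQL for comparison
    Invoke-Sqlcmd -ServerInstance "Server01" -Database "MyDatabase" -Query "SELECT COUNT(*) AS cnt FROM sys.tables"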

    Anyway, I tested it on another machine, and was able to reproduce the issue.

     I'm stumped.

    Posted Wednesday, April 08, 2009 1:35 PM by timbenninghoff | 1 Comments
  • My Plug for SQLPSX

    Chad Miller, one of my favorite bloggers on SQL Server and PowerShell, has just announced the release of version 1.5 of SQLPSX.  SQLPSX is a collection of functions, cmdlets, and scripts that make accessing SQL Server via PowerShell extremely easy.

    For example, normally, without even considering error handling, you'd have to write the following PowerShell code to get a SQL Server server object:

    $srv = new-object ("Microsoft.SqlServer.Management.Smo.Server") "YourServer"

    But with SQLPSX, you get that functionality, plus error handling, in the easy-to-use, easy-to-remember function, Get-SQLServer:

    $srv = Get-SQLServer "YourServer"

    I use only a fraction of what SQLPSX has to offer, but functions like Get-SQLServer save me so much typing and time that I've modified my PowerShell profile to load the LibrarySMO.ps1 script whenever I start up PowerShell.
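
    If you want to do the same, the profile tweak is a one-liner; the path below is just where I happened to unpack SQLPSX, so adjust to taste:

    # Dot-source the SQLPSX SMO library so Get-SQLServer and friends load with every session
    . "C:\PowerShell\SQLPSX\LibrarySMO.ps1"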

    I also recommend checking out Chad's great articles on PowerShell and SQL Server on SQLServerCentral.com, such as getting data out of SQL Server using PowerShell and loading data into SQL Server using PowerShell.

    Posted Saturday, March 28, 2009 12:14 PM by timbenninghoff | 0 Comments
  • Thoughts On The Cloud

    Cloud computing is all the rage right now.  Many of the people I follow on Twitter, whose opinions I respect, are talking about it as the wave of the future.  Paul Nielsen made a bold prediction on his blog last month.  Brent Ozar blogs about it frequently and appears pretty excited about it.  And, without a doubt, the news today about SDS is adding momentum to cloud computing.

     

    So, anyway, here I sit.  I'm in danger of slipping behind on my goals for 2009 to blog once a week.  I need a topic.  Why not cloud computing?

     

    I'm a DBA.  I'm in the IT trenches every day, and my job boils down to making sure that people have access to their data, and that their data is protected: from theft, from disaster, from accidents, from their own mistakes.  I work hard during the day, and I have enough interest in my craft to dedicate some of my personal time to it.  Here's cloud computing through my lenses.  You’ll need your own grains of salt.

     

     When You're A Hammer...

     

    Despite my best efforts, I’m not the creature of logic I strive to be.  And this is another one of those times when it’s very possible that my emotions are getting the better of me.  So, let me get the emotional part of my critique of cloud computing out of the way.  My instincts say it is not even going to come close to approaching the bold pronouncements of its major proponents, be they vendors or technology experts.  Why?  I’d counter that question with some questions for you.  How’s SOA working out for you?  How about offshoring?  Virtualization?

     

    These are just a few examples of recent IT ideas that, while they looked great on paper, have in no way been universal successes.  I’m not suggesting that because they are not universal successes they must be failures, but what I mean to say is that they have their time and place.  Clearly, your perspective has a lot of influence on whether you consider them successes, and some of them have been more successful than others.  But, they are another tool in your toolbox for solving problems.

     

    I see cloud computing as just another tool in the toolbox.  It will have some impact on the IT topography of companies, but it will not dominate the landscape for probably a couple decades.

     

    So, where’s my proof?  I don’t have any.  It’s just my gut talking.  It’s my brain interpreting what I perceive as the loud volume of vendors advocating for the cloud, and the silence of customers, or potential customers, praising it as a success.  It’s my experience dealing with vendors and imagining that I will have the same sorts of problems dealing with cloud computing vendors that I do with hardware and software vendors today.  It’s my natural defense against hype.  It's all of those things, and probably more.

     

    I Can See The Future?

     

    It’s not fair for me to just play armchair critic and not risk a little something, so here are my predictions.

     

    Over the next 18 months there will be a major push towards cloud computing in the IT press.  (C’mon, that’s a gimme.)

     

    In 5 years, large companies will start pulling back from their initial, heavy investments in cloud computing for numerous reasons.  There will have been a high profile security breach.  A major cloud computing vendor will have gone belly up, likely due to some scandal, putting its customers at risk.

     

    In 20-30 years my post will seem foolish, as cloud computing will be a common, natural, but still not exclusive, part of the IT landscape.  Why 20-30 years?  Because I think it’s going to take a cultural shift of kids growing up in a time of ubiquitous internet and the concept of non-tangible goods, like digital music files, to take leadership positions and express their unique view of IT in their companies’ decisions.

     

    That’s Just A Start

     

    I think I’ve got a lot more to say about cloud computing, but I’ve used up my blogging minutes for today.

    Posted Tuesday, March 10, 2009 5:10 PM by timbenninghoff | 1 Comments
  • Beware Moving Parts

    My faithful readers will remember one of my original posts in which I expressed my enthusiasm for a new phone I was planning on purchasing, the Pantech Matrix.  Well, tragedy struck today when, for no apparent reason, the slider keyboard stopped functioning.  If I extend the keyboard slowly, I see the screen image reorient to the keyboard and the keyboard lights go on.  But, when the keyboard is halfway extended, the keyboard lights suddenly turn off and the screen reorients back to the phone orientation.  I've seen this described online as having the screen switch to landscape mode when the keyboard is extended.

     I called the AT&T store at which I bought the phone, and since the phone is still under warranty, they gave me a phone number to call for warranty service.

    On the plus side, the phone support person was very professional and walked me through a couple steps to see if a reset of the phone back to its factory settings would solve the issue.  I didn't have much hope, since it feels like a mechanical issue to me, but I went along with the testing.  Sadly, it didn't fix the problem.

    On the plus side, they're going to send me a replacement phone which I should receive in 5 days.  On the down side, the phone is a refurbished one, not new, and my warranty does NOT reset to the date on which I receive the new phone.  I suppose I can live with the refurb, but the lack of a warranty reset doesn't exactly instill me with confidence.

    I will give the Pantech Matrix this credit; being deprived of the keyboard I've had to rely on the predictive edit mode, which is actually pretty slick and lets me type on the number pad almost as quickly as I would on the keyboard...almost.

    And there's the dark side to being an early adopter of new technology.  Sometimes you're the guinea pig on your own dime.

    Posted Tuesday, February 24, 2009 6:36 PM by timbenninghoff | 0 Comments
  • Expandable!

    When I start to learn a new technology, I find I learn best if I jump right in, get my hands dirty, and play around with it for a while.  After a couple initial ah-HA moments and early successes, I inevitably hit a wall where I can't seem to solve a particular problem.  By that time, I've built up quite a list of questions, and I'm ready for a deeper understanding of what's really going on.

    At that point it's time for me to pick up a tech book and read.  And that's when I get a couple more ah-HA moments almost right away.  Case in point:  I've been working with PowerShell and SQL Server on a regular basis now for a couple months, and I've written a couple of scripts based on some tutorials I've found online and some excellent articles and blog posts people have written.  I'd hit that wall, and so based on the recommendations I'd read on a couple blogs, I picked up Bruce Payette's Windows PowerShell In Action. It's turning out to be an excellent read.

    In a previous blog post I had written some code that had this little gem in it:

    Get-Content $tblfile | ForEach {$trdir=$fulldir+"\dbo."+$_+"\triggers\"; cd $trdir; dir | ForEach{$_.Script()>> $outfile}}

    Experienced PowerShell users can probably spot my code silliness right away.  As Bruce Payette's book explains, variables are expanded when placed within double quotes.  So, in the case above, assigning the string to $trdir is an extra, unnecessary step.  Knowing this, I could have saved my precious fingers for my accordion playing by dropping the $trdir assignment:

     Get-Content $tblfile | ForEach {cd "$fulldir\dbo.$_\triggers\"; dir | ForEach{$_.Script()>> $outfile}}
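
    A tiny illustration of the expansion rule, in case it's new to you too:

    $name = "triggers"
    "dbo.Table1\$name\"     # double quotes expand the variable: dbo.Table1\triggers\
    'dbo.Table1\$name\'     # single quotes leave it alone:      dbo.Table1\$name\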

    Were you to know that I was a DBA who primarily worked with T-SQL, this mistake would make sense.  T-SQL doesn't have something as nifty as expandable variables within quotes...but that doesn't make me love it any less.  But, you're a persistent reader, you went back and read my previous blog post, and you're quick to point out 'Tim!  You used double quotes to expand variables earlier in that same script!!!'  Would you believe that sometimes computers speak to me and I just KNOW what I should be doing without understanding it?  No?  You, reader, are much too clever and should be writing your own blog.  I promise to read it.

    Posted Wednesday, February 18, 2009 6:46 PM by timbenninghoff | 0 Comments
  • Things You Know Now...

    Jeremiah Peschka tagged me in a recent blog post, asking me about things I know now, not necessarily SQL related, that I wish I'd known when I started my 'professional career'.  Here's my stab at it:

    Almost Nobody Knows What They Want To Be When They Grow Up

    I'll admit it.  I was jealous of everyone who seemed to have 'a plan' for what they were going to study in college or what they were going to do for a job, or even better, a career, once they got out of college.  Life, in a way, seemed easy for them because they had a clear path of what they needed to do to succeed.  I started out in Zoology, earned a degree in Entomology, and ended up doing temporary office/clerical work for 4 years because I 'had no idea what I wanted to do with my life'.  And suddenly, a string of events put me on the path to becoming a DBA.  Yeah, yeah...what's my point?  Try stuff!  If nothing else, the convoluted story of how you got into your career will make for interesting conversation at parties.

    Take The Long View

    If I could go back in time and talk to myself in the past, I don't think there is any way that I could properly convey that every little bit does indeed count.  But, I wish I could.  Those 10 minutes you spent reading an article about the physical allocation of pages in SQL Server?  That forum thread on a particular DTC error?  That day you just couldn't leave work without knowing the answer?  Those things are all going to add up.  And one day, you're going to wake up and think 'Holy Crap, I really DO know things!' and people are going to actually agree with you.  And, with luck, that thought will be followed by 'Holy Crap, there's still so much I don't know!'

    There Are No DBAs, Only Zuul

    OK, that title didn't make sense, but who doesn't like a good Ghostbusters reference?  What I mean is, there is no magic path to becoming a DBA.  There's no DBA school, there's no authoritative list of prerequisites that suddenly confer on someone the title DBA.  Hardly anyone hires a DBA with no experience, and everyone is looking for that sacred 5 years of experience.  But how do you get that experience if you never hold the title DBA?  It's a MYSTERY!  It's both a blessing and a curse.  It means that you'll know you want to be a DBA, but you won't know how to get there other than 'do stuff with databases'.  Oh sure, you'll read some fancy blog posts about 'how to become a DBA', but you'll inevitably end up frustrated because it's virtually impossible to translate their advice to your life.  So you keep slogging through, 'doing stuff with databases', until one day you get a job where someone ACTUALLY calls you a DBA, and you realize that having the title is meaningless anyway.

     DBAs Are Technologically Advanced Enough To Be Considered Magic

    It's true.  And that's why we are essential to the success of any business.  Businesses ignore this at their own peril.

     

     

     

    Posted Tuesday, February 10, 2009 6:39 PM by timbenninghoff | 1 Comments
  • Semantically Equivalent?

    I was helping some developers with their upgrade of an ISV application, when we hit a snag.  It seems that our installation of the application had columns defined as binary(8) when the standard data structure called for timestamp.  I seemed to recall that timestamp held binary data, but I could not remember the details.  So, I turned to Books Online and re-read the definition of the timestamp data type.

     And that's where I saw it...the phrase 'semantically equivalent'.  As BOL stated 'A nonnullable timestamp column is semantically equivalent to a binary(8) column.'  I thought I knew what 'semantically equivalent' meant, but since I was pretty sure I couldn't explain it to someone else clearly, I looked it up to make sure.  But, I just couldn't seem to agree that nonnullable timestamp and binary(8) are semantically equivalent.  If I had a binary(8) column, and I put a value in it, would that necessarily have the same meaning if I put the exact same value in a timestamp column?  Or, does the fact that SQL Server automatically updates the timestamp column on an insert or update make the 2 sufficiently different?  In other words, does the metadata about timestamp make binary(8) and timestamp different?  Or, am I, forgive the pun, arguing semantics?
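
    If you want to see the behavior for yourself, here's a rough sketch you could run against tempdb; invoke-SQLCMD, the server name, and the table name are my own additions:

    # The timestamp column changes on every insert/update; the binary(8) column does not
    $q = "CREATE TABLE dbo.TsDemo (id int, ts timestamp, bin binary(8));
          INSERT dbo.TsDemo (id, bin) VALUES (1, 0x00);
          UPDATE dbo.TsDemo SET id = 2;
          SELECT id, ts, bin FROM dbo.TsDemo;
          DROP TABLE dbo.TsDemo;"
    Invoke-Sqlcmd -ServerInstance "Server01" -Database "tempdb" -Query $q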

    These are the things that keep me up at night...well, that and contemplating the total body count in the first season of True Blood.

    Posted Thursday, February 05, 2009 6:07 PM by timbenninghoff | 0 Comments
  • 2009 Professional Resolutions

    A month or so ago I was tagged in a blog post by SQL Batman to share my professional resolutions for this year, 2009.  It might seem a little late to be thinking about this sort of thing, but honestly, there is a method to my madness.  I like to use the entire month of January to review what goals I made the previous year, make tentative goals for the new year, and give those resolutions a test run for a month.  After that month, I've got a pretty good idea what's feasible, and I can define a bonus goal that requires me to stretch to reach.  So, cutting the chatter.....

     Certification:  For my own personal gratification I'd like to get my MCITP: DBA this year.  I need both 70-443 and 70-444.  Bonus goal:  Take a third exam.

     Blogging:  Make a blog post at least once a week.  Bonus goal:  More than once a week.

    Reading:  Read a technical book on SQL Server or PowerShell once every two months.  Bonus goal:  Once a month.

    Technology:  I've chosen PowerShell as the new technology that I'd like to become fluent in.  This is a difficult goal because it doesn't really have a clear definition of when the goal is reached.  It's more like a technological mission statement.  Bonus goal:  SQL Server Integration Services

    Scripting:  Write or find 1 script or query a week to add to my 'toolbox' that assists me in my administration tasks.  Bonus goal:  Two or more per week.

    Learning:  Attend my local PASS user group meeting at least once this year.  This one is always tricky since the nearest group is an hour or more away and starts at the unholy hour of 4:30 PM.  Also, I must attend the PASS summit again this year.  It is imperative.  Bonus goal:  Only The Shadow knows.

     Since this blog theme is well past its freshness date, I'll let it rest.

    Posted Tuesday, January 27, 2009 6:28 PM by timbenninghoff | 0 Comments
  • Using PowerShell To Script Out Table Triggers In A Database

    A request came in from one of the developers asking if there was a way to script out all of the triggers for a list of tables in a database on SQL Server 2005.  This database has around 60k tables, and navigating to each of the 40 some tables that the developer needed triggers for in SQL Server Management Studio (SSMS) was becoming quite a chore.

     My first instinct was to turn to the 3rd-party database management tool to see if it could handle the job, but I wasn't holding my breath.  The vendor has been having serious quality issues lately, and I've been getting bitten regularly by bugs in their software.  Well, it came as no shock that, though the tool does have the functionality, it's buggy, and didn't give me what I needed.

     So, I turned to PowerShell, exercised my novice PowerShell skills, and came up with this little script:

     $srvr = "Server01"
    $inst = "Default"
    $db = "MyDatabase"
    $tblfile = "C:\PowerShell\tables.txt"    # placeholder: text file with one table name per line
    $fulldir = "SQLSERVER:\SQL\$srvr\$inst\Databases\$db\Tables"
    $outfile = "C:\PowerShell\triggers.sql"  # placeholder: file the scripted triggers get appended to

    Get-Content $tblfile | ForEach {$trdir=$fulldir+"\dbo."+$_+"\triggers\"; cd $trdir; dir | ForEach{$_.Script()>> $outfile}}

      I wrote the script for PowerShell 1.0.  You'll also need to have the SQL Server 2008 Management Objects installed, either by having SQL Server 2008 installed, or by following the instructions I listed in a previous blog post.

    The $tblfile is really nothing more than a text document with a list of table names like:

    Table1
    Table2
    etc.
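
    As an aside, one quick way to build that list file in the first place might be something like this; the server, database, name filter, and file path are all placeholders:

    # Write the matching table names, one per line, to the list file the script reads
    Invoke-Sqlcmd -ServerInstance "Server01" -Database "MyDatabase" -Query "SELECT name FROM sys.tables WHERE name LIKE 'XYZ%'" |
        ForEach-Object { $_.name } |
        Set-Content "C:\PowerShell\tables.txt"
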
    Posted Thursday, January 22, 2009 9:43 AM by timbenninghoff | 1 Comments
  • Problem Creating Virtual Directory For Named Instance in Reporting Services

    I know my faithful readers are waiting on pins and needles for me to post my goals for 2009.  I promise they are forthcoming, but I prefer to take my wife's advice and use the entire month of January to review the previous year, define goals for the new year, and give those goals a trial run to see if they are reasonable, or outrageous.

     Anyway, as I work through that process, I thought I'd just do a repeat of an issue I found while administering SQL Server 2005 Reporting Services that I originally posted in the forums over at SQL Server Central.

     When you install SQL Server 2005 Reporting Services as part of a named instance install and, as part of the installation, choose to allow Reporting Services to be configured for you automatically, a couple of virtual directories are created in IIS by the installer using the name format Reports$InstanceName.  As I was migrating SQL Server to a new server, I did a clean install of SQL Server 2005, and this time I did not have the SQL Server installer set up Reporting Services for me.  I went to the Reporting Services Configuration Manager and tried to set up the Report Server Virtual Directory and Report Manager Virtual Directory to use the ReportServer$InstanceName name format again, but I received the following error:

    ReportServicesConfigUI.WMIProvider.WMIProviderException: The virtual directory specified is not valid. Make sure the specified name is not too long and doesn't contain illegal characters. (example: ;) at ReportServicesConfigUI.WMIProvider.RSReportServerAdmin.CreateVirtualDirectory(String virtualDirectory, String path)

    I assumed this was a result of the use of the dollar sign ($) in the directory name. I thought perhaps it might just have been a bug with the SP1 Reporting Services Configuration Manager that I had installed at the time, but I also tried connecting with an SP2 Reporting Services Configuration Manager to my SP1 Reporting Services, and I still got the same error.

    The fix for this issue involves manual registry changes and manually modifying .config files, so beware, AND I've only given it the briefest of testing (deploying a report from BI Studio, running a report from Report Manager, etc.).  As of this updated writing, though, this configuration has been in our test environment for about a month, the instance has been patched to SP3, and I haven't had any problems.

    In my instructions below I use ( ) to indicate variables that you should replace with your own values, minus the (), of course.

    How to preserve $(InstanceName) in Virtual Directory name:

    1)  Via IIS on the existing Reporting Services installation, save the Reports$(InstanceName) and ReportServer$(InstanceName)  virtual directories configurations to a file.

    Through the IIS Manager, right click on the virtual directory, All Tasks>Save Configuration to file...
    Note:  I used .xml as the extension for this file
    Note:  Do this for both virtual directories

    2)  Migrate files to the new server, which already has Reporting Services installed.  In my case I already had Reporting Services configured to use virtual directories that didn't have $(InstanceName), though this isn't necessary.

    3) Import the IIS virtual directory configurations to IIS on the new server

    From IIS Manager on the new server, right click on your Default Website and select New>Virtual Directory (from file)...
    Use the Browse button to locate your .xml files from step 2, and then click Read File
    Select the Virtual Directory name that appears in the white box and click the OK button.  You should see your new Virtual Directory appear with the name Reports$(InstanceName) and ReportServer$(InstanceName)

    NOTE:  Because I had changed the MSSQL.# between servers, I had to modify the file path in the virtual directory's properties in order to point to the right folder.

    4)  Stop Reporting Services

    5)  Modify the following registry settings under HKLM\Software\Microsoft\Microsoft SQL Server\MSSQL.(instance#)\Setup\ (a PowerShell sketch of this step appears after step 10):
            RSVirtualRootApplication - Reports$(InstanceName)
            RSVirtualRootApplicationPath - IIS://(ServerName)/W3SVC/1/Root/Reports$(InstanceName)
            RSVirtualRootServer - ReportServer$(InstanceName)
            RSVirtualRootServerPath - IIS://(ServerName)/W3SVC/1/Root/ReportServer$(InstanceName)

    Also, you may need to add the $(InstanceName) to the following keys if they don't already exist:
           ApplicationPoolConfigured_RM
           ApplicationPoolConfigured_RS
    In my particular case, these changed automatically for me, so I didn't change them manually.

    NOTE:  (instance #) will of course be the instance number of your Reporting Services installation...like MSSQL.2, etc.


    6)  Modify the following .config files for Reporting Services:
    1) In C:\Program Files\Microsoft SQL Server\MSSQL.(instance#)\Reporting Services\ReportManager\, modify RSWebApplication.config so that the ReportServerVirtualDirectory node contains ReportServer$(InstanceName)
    2)  In C:\Program Files\Microsoft SQL Server\MSSQL.(instance #)\Reporting Services\ReportServer\, modify rsreportserver.config so that the UrlRoot node contains http://(ServerName)/reportserver$(Instance Name)

    7)  Delete the old virtual directories, Reports and ReportServer, from IIS, if present
    NOTE:  Like I mentioned earlier, they were present for me because I had set up Reporting Services without the $InstanceName ahead of time, so I had to delete them.  If you haven't set up RS yet, you should be able to skip this step.

    8)  Restart Reporting Services
    NOTE:  If you're paranoid, it might not be a bad idea to restart IIS before you restart Reporting Services.  I'm paranoid, so I restarted IIS as well.

    9)  Via Reporting Services Configuration Manager, verify that the Report Server and Report Manager Virtual Directories contain the new values

    10)  TEST!
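
    As promised back in step 5, here's a rough PowerShell sketch of those registry changes (run it elevated); the instance number, server name, and instance name are placeholders you'd swap for your own:

    # Point the RS setup keys at the new $-style virtual directories (placeholder values)
    $setup = 'HKLM:\Software\Microsoft\Microsoft SQL Server\MSSQL.2\Setup'
    Set-ItemProperty -Path $setup -Name RSVirtualRootApplication     -Value 'Reports$MyInstance'
    Set-ItemProperty -Path $setup -Name RSVirtualRootApplicationPath -Value 'IIS://MyServer/W3SVC/1/Root/Reports$MyInstance'
    Set-ItemProperty -Path $setup -Name RSVirtualRootServer          -Value 'ReportServer$MyInstance'
    Set-ItemProperty -Path $setup -Name RSVirtualRootServerPath      -Value 'IIS://MyServer/W3SVC/1/Root/ReportServer$MyInstance'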

     And that's it!

    Posted Thursday, January 15, 2009 7:13 AM by timbenninghoff | 0 Comments