SQL Server Integration Services

Attending the PASS Summit

Steve Jones makes some good points in his blog post Training. I find it difficult to believe the short-sightedness of some organizations when it comes to training events like the PASS Summit.

This year's Summit - like all previous years to date - had enough top-notch presentations and labs to make it worth the combined cost of admission, travel and expenses, and allowing a database professional to leave work for three days. More than enough.

Like Steve, I don't get it.

Also like Steve, I bet we'll see these DBAs at the 2008 PASS Summit in Seattle - and working for another company.

I wonder if those responsible for denying database professionals opportunities for training factor in the cost of hiring and training a new DBA every six to eighteen months.

:{> Andy

Technorati Tags: EMPs Database Professionals PASS Training Changing Jobs

At the 2007 PASS Summit!

PASS stuff!

I'm at the 2007 PASS Summit!

I was wandering around the Colorado Convention Center earlier today in shorts, unshaven, tired... looking like I'd been rode hard and put up wet. But I caught the last half of Gert Draper's excellent presentation on Team Edition for Database Professionals! Gert is the man.

I'm presenting on the same topic tomorrow (yeah, I know - great move there, Andy...) and then on SSIS Development Practices Thursday. This promises to be the geekiest week I've had in a long time!

:{> Andy

Technorati Tags: PASS Summit 2007 Denver Team Edition for Database Professionals

Getting Ready For The PASS Summit!

The PASS Summit is less than two weeks away!

I'm getting ready for my presentations. I need a couple laptops to host virtual servers for the demos, so I bought some new gear to take with me.

Check out my Network-In-A-Bag!

Network in a bag!

It's a power strip, a couple CAT6 cables, power supply, and a NetGear 1G 5-port switch - all in a 1 gallon Ziploc bag.

:{> Andy

Technorati Tags: PASS Summit 2007 Networking

Iteration = Maturity

Introduction 

I was recently reminded that iteration matures software.

The History of Andy, Part 1 

Like many DBAs, I was a software developer in another life. I built web applications - working my way up from HTML through DHTML and finally to ASP - and could regale (and bore) you young whipper-snappers with war-stories of how things were "back in my day". [/DanaCarvey]

But I won't.

The Times They Are a-Changin'

I'll share instead something I've witnessed many times since starting with software in 1975 - and something you probably already know: stuff changes.

And thank goodness stuff changes!

I recently ordered 1G of RAM from an online retailer. It should arrive before my next son (but that's not a given as Riley refuses to provide a tracking number - the doctors will induce Christy into labor Friday if he hasn't been born by then - but I digress...). I remember my neighbor John, who introduced me to computers, purchased a 256-byte RAM chip in the mid-1970s for about what I paid for the 1G. That's 256 bytes of RAM - not a typo. As I recall it was either a 14- or 16-pin IC.

Things have changed since then. Improvements in technology, brought about by building and improving upon existing knowledge, have brought us to a day when I can purchase 1,073,741,824 bytes for roughly the previous price of 256. I don't know how you feel about that. I think it's a good thing.

The idea of "building and improving upon existing knowledge" defines iterative development. Although the idea is relatively new to the software development field, it serves as the basis for engineering disciplines. Engineers iterate - build and improve upon existing knowledge - and we get more powerful hardware for the same amount of money. What's not to like?

Iteration - it's not just a good idea... 

Iterative software development builds and improves upon existing knowledge within a specific domain. Most domains are defined by an application (wholly or in part), enterprise knowledge (again, wholly or in part), or - most likely - some combination of the two. For example, let's say you work for a large corporation as a software developer. Your domain could be the corporate website, in which case you possess knowledge about the business of the corporation and about web development. You mix these together to do your job. In this case, you will probably pick up marketing savvy and current trends along with the latest AJAX techniques.

As you make successive passes (iterations) through the website design, interacting with marketing, your domain knowledge builds and improves. As your domain knowledge increases, the website will become more valuable to the corporation - as will you.

Iteration adds value.

Got Iteration?

The same can be said for database development.

Perhaps you've experienced this in your own database development efforts: you receive a request for a database design to meet some desired functionality. Or you're handed a design and asked to optimize it. Or maybe you had an idea to capture data - performance metrics or something similar - and you're designing a database solution to accomplish this.

You get into the development a few hours or a few days and realize a little tweak here or there would improve performance or readability, or better adapt the design to your intentions. So you make the tweak and continue.

This improvement leads you to re-examine other portions of the design and you make more tweaks. Maybe your last change broke things. Maybe you see an opportunity to add a parameter to a stored procedure and combine the business logic of three stored procedures into one.

A "Growing" Solution 

Pretty soon, you have iterated enough to feel comfortable promoting, integrating, or even releasing the results - letting the effort move to the next step.

Depending on the nature of your efforts, it may not end there. If your database development is the back end of a larger application - say, the corporate website, for example - there will likely be requests for changes over time as the site grows (scales) in complexity and size.

When the requests come in you are not likely to start over. You will most likely build and improve upon your existing knowledge. You will most likely iterate.

Scaling forces iteration.

Voilà

This is how solutions mature - be they applications, databases, or both - regardless of who writes them or how many are involved in the development effort. It doesn't matter if the development team is one lady in a cubicle in the European Union or a development team of thousands at Microsoft.

Iteration matures software.

:{> Andy

SSIS for DBAs

Solid Quality Mentors is now offering a course entitled SQL Server 2005 Integration Services for Database Professionals.

I was honored to work with Erik Veerman to develop this SSIS for DBAs course! Our goal is to provide training in SSIS's non-ETL capabilities - stuff that is useful for Database Administrators.

Although it is an amazing ETL engine, SSIS is more than an enterprise ETL platform - it has lots of capabilities built especially to assist DBAs in the Production environment. In the course, we introduce the SSIS development environment and SSIS to folks who may have never opened Business Intelligence Development Studio.

Next, we cover the basics of SSIS - walking through capabilities of the Control Flow, Data Flow, Event Handlers, Variables, Properties, and Expressions.

We then get busy with Transfer Tasks, Import and Export Wizards, and Maintenance Plan development. Our next section focuses on using SSIS to perform more advanced DBA tasks - building SSIS packages that automate FTP, flat file data extraction and loading, and file archiving operations. We cover optimization, data cleansing, text mining, and binary data. We also dive into WMI, File watching, and Notifications - there's even an optional section on migrating DTS!

We then address SSIS administration, deployment, security, package configurations, logging, modular package design and team development. We cover restartability, snapshots, and transactions, and conclude with a section dedicated to troubleshooting.

Material is presented, then demonstrated. Students then perform lab exercises to reinforce the presented and demonstrated concepts. It's a great way to learn - especially in a rich visual environment like Business Intelligence Development Studio.

As if all that wasn't enough, the course is led by Solid Quality Mentors! These are folks who have been there and done that - database professionals that are published, MVPs, or both!

If you're interested, contact Solid Quality Mentors.

:{> Andy

Technorati Tags: SSIS DBA SSIS for DBAs Database Professionals Solid Quality

Meltdown!

A couple days ago the Vista Ultimate instance on my laptop went all Klingon on me: it was "a good day to die."

I'll never know why for sure. Indications point to COM+ and VMM giving up the ghost. They were good systems, may they rest in peace.

I tried to revive the old OS. It would run in Safe Mode and even Safe Mode with Networking, but that's just not the same as having all the functionality I know and love.

I've spent the last couple days (and nights) rebuilding a second instance of Vista Ultimate on the same machine. I'm about half done at this point. Today is SQL Server instances and Visual Studio Team System day.

It's not just the installs; there are the service packs and updates after the installations. Lots of installing.

But it's also an opportunity to rebuild the machine with a different configuration.

Back to installing...

:{> Andy

Technorati Tags: Vista Re-install

Raleigh Code Camp - tomorrow!

I'm looking forward to speaking at the Raleigh Code Camp tomorrow!

I'll be presenting on Team Edition for Database Professionals and Incremental Loads with SSIS. It's going to be fun!

If you're attending tomorrow and read this blog, please introduce yourself!

:{> Andy


Back in the Saddle...

Monday, after a five-month hiatus, I return to SSIS training.

I really enjoy training whether I'm attending or leading it. It killed me last week to miss both Tech Ed and the Richmond .Net Users Group June meeting, but a client needed me on-site and I try to never leave a client hanging.

I roll out around 6:00 AM tomorrow to head for the airport, and it's 1:40 AM as I type this - another day when it's good to not need sleep! :)

To the good folks at New Horizons in Edina, MN: I'll see you Monday!

:{> Andy

Technorati Tags: SSIS training Minneapolis MN

SQL Server 2008 CTP Available!

The June CTP of SQL Server 2008 (the database server formerly known as Katmai) is now available!

Registration is required. You must also complete a three-question survey.

:{> Andy

Technorati Tags: SQL Server 2008 Katmai CTP

Managing The Thing You Cannot Touch

Yesterday I wrote about The Thing You Cannot Touch. Today I'm going to tell you some ways to manage the situation.

First, try to determine why You Cannot Touch The Thing. This is invaluable information in charting the waters ahead - especially if you're consulting.

Second, accept the fact that there's better than a 90% chance that you will not, in fact, be allowed to Touch The Thing. In my experience, three things must be true for you to overcome the business friction imposed by The Thing:

  • You have to try everything else first.
  • Everything else must fail to sufficiently address the issue.
  • The source of the issue must be mission-critical.

Regardless, your best knee-jerk reaction is acceptance. This is tough for a professional. In your heart of hearts you know what it takes to solve the real issue. And yet, you've been told You Cannot Touch It.

The good news? There's also a better than 90% chance you can find a way to solve the issue - or at least alleviate the client's pain - without Touching The Thing.

Modern enterprise applications are composed of lots of moving parts. The Thing is probably not the sole source of pain. Addressing other bottlenecks may do the trick - at least for now.

And, if you're the person they called last time they had an issue and you solved it (and weren't "difficult" to work with), you'll likely get the call next time.

How cool is that?

:{> Andy

Technorati Tags: Consulting Software Development Satisfying The Customer Leveraging New Business

The Thing You Cannot Touch

I have this theory about consulting. I call it The Thing You Cannot Touch. Since a few friends have found it amusing I thought I'd share. It goes like this:

A potential client contacts your firm. A conference call is arranged to discuss the issue. During the call, the issue is defined. Resolution theories and attempts to date are shared, along with their results. The current status is explained - along with

The Thing You Cannot Touch.

Sometimes an attempt at justification accompanies the announcement: "We know it can't possibly be _______ so we're not going to waste any time looking at it."

Other times, it's just put out there for what it is: "You can't touch _______."

My experience has shown the heart of the issue almost always lies with The Thing You Cannot Touch. It needs to be fixed but someone, somewhere, for some reason does not believe it to be so - and so it Cannot Be Touched.

Sometimes it's political - It's someone's "baby". They built this application just ten short years ago - worked nights and weekends and toiled and sweated and bled to make it work - and rode it all the way to CIO, after all. Who are you, lowly consultant, to tell them VB 6 code should be re-written in this new fad known as .Net? Doesn't Vista support VB 6 until the mid-20-teens?

Sometimes the decision-maker doesn't understand the differences in the technologies.

Sometimes it's a purely market-driven business decision - and the decision-maker is right and justified in choosing to keep hands off The Thing. It's not all about technology folks... it's sometimes about what I like to describe as the (little "s") software (big "B") Business.

If you find yourself on a consulting conference call and The Thing You Cannot Touch comes up, pay attention. Tomorrow I'll tell you how to Manage The Thing You Cannot Touch.

:{> Andy

Technorati Tags: Consulting Software Development Thing You Cannot Touch Old Code Outdated Code VB 6

SSIS Design Pattern - Incremental Loads Post Now Live at ABI

The latest in my SSIS Design Patterns series - SSIS Design Pattern - Incremental Loads - is now live at Applied Business Intelligence!

:{> Andy

Technorati Tags: SSIS Design Patterns Incremental Load Business Intelligence ETL BI

Updating the Virtual Team Foundation Server document

Talk about overdue... I'm finally updating the Virtual Team Foundation Server documentation at VSTeamSystemCentral.com.

There are 163 pages out there now that demonstrate the procedure I use to build a vTFS, but they were written for the December 2005 CTP of TFS. A couple things changed - not much, mind you: I think the biggest change is fewer permissions (not a member of local administrators) for the TFSReports and TFSService accounts. This means the document is ok as far as functionality is concerned, but it's always best to follow the principle of least privilege.

I'm finishing up a project in the next couple weeks. Adding some content to my blogs and VSTeamSystemCentral.com is one priority. I have about 35 blog posts in my \Andy\Blogs\Primordial folder just waiting to be posted - most of them at Applied Business Intelligence in a series I'm calling SSIS Design Patterns.

I really enjoy having too much work to do - I like it a lot more than the alternative. But I'm also looking forward to doing more work around the house as Wee's birth approaches.

:{> Andy

Technorati Tags: Team System Team Foundation Server SSIS Design Patterns Business Intelligence Wee Leonard

SSIS Design Patterns Series

On Applied Business Intelligence I've started a new series called SSIS Design Patterns.

The first in this series is now posted: Dynamic SQL.

:{> Andy

Technorati Tags: SSIS Dynamic SQL Design Patterns

A Green Box

It's been way too long since I posted a blog post - my apologies! I've started a bunch of posts (mainly so I wouldn't forget the idea on the top of my noggin), but nothing complete enough to release into the wild.

I've been swamped on a consulting gig. It's a good gig - just intense. Since I started doing lots of SSIS training for Solid Quality, I had this idea it would be good to do a few weeks of consulting every now and then - just to stay sharp.

After battling a particularly "interesting" requirement, I finally got this:

Most of you reading this are developers. You know why I'm posting - it's the 11th hour of a 12-hour day. And yesterday was a 12-hour day too. Tomorrow is another really long day, but it's travel-home day and Stevie Ray's P.E. and Show-and-Tell day. His teacher tells us that for the past six weeks, Stevie's Show-and-Tell item has been the same: "Dada's coming home tonight!" Heart-warming and -breaking, all at the same time.

So this is a celebration. Houston, we have a green box.

:{> Andy

How do you say "SSIS" in Mandarin?

This is cool - Professional SQL Server 2005 Integration Services has been translated into Mandarin!

:{> Andy

Technorati Tags: SSIS Integration Services Mandarin Chinese

I will present SQL Server Tips and Tricks 8 Feb 2007

After the Freezing Redneck Tour 2007 wraps up this week, I'm looking forward to the next meeting of the Richmond SQL Server Users Group.

I've decided to present on SQL Server Tips and Tricks. I have three topics in mind:

  • SqlClr
  • An Introduction to Incremental Data Warehouse ETL with SSIS
  • A surprise topic!

Trust me, if you work with SQL Server data warehouse ETL you will like the surprise. ;)

Hope to see you there!

:{> Andy

Technorati Tags: Developer Community Richmond SQL Server Users Group SqlClr Incremental ETL

A couple changes!

I've started a new blog: Applied Business Intelligence!

I will continue to blog here about Team System topics. I'll probably continue to share personal stuff here as well - I'm debating that...

I've also changed my login here at VSTeamSystemCentral.com. I'll no longer be posting as that dry and boring "admin" guy - now I'll be posting as me!

:{> Andy

Technorati Tags: New blog Applied Business Intelligence BI SSIS ETL Reporting Services

posted Tuesday, January 16, 2007 10:45 AM by andy with 0 Comments

The Freezing Redneck Tour - 2007 Rolls On!

The Freezing Redneck Tour - 2007 continues. In Week 2, I'm off to Minneapolis for an SSIS training course.

Since I started traveling regularly, I've been checking out different airlines - just to see what the differences are. I don't have enough data to make any determinations yet, but I see an interesting trend: a lot of the major airlines subcontract to smaller carriers. I see it with a flight advertised by Delta, for instance, but with a sub-heading that says something like "Operated by Fly-Me-Please Airlines."

It looks like the boarding for my connector is beginning - on time this time! :)

:{> Andy

Technorati Tags: Solid Quality Learning SSIS Training Minneapolis

SSIS Raw File Reader Review

As promised in an earlier post, I've tested the SSIS Raw File Reader released by Simon Sabin this past week.

Here's how I tested it:

First, I start a new SSIS project. I rename the package RawFileGen.dtsx. I add a Data Flow Task to the Control Flow as shown:

I double-click the Data Flow Task to edit it and add an OLE DB Source. I double-click the OLE DB source to edit it and aim it at the AdventureWorks HumanResources.Employee table as shown:
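Selecting the whole table this way is equivalent to the following query (shown in case you prefer the SQL Command access mode):

SELECT *
FROM HumanResources.Employee;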

I add a Raw File Destination and configure it to dump the results to a file as shown:

I check the Input Columns to see if all are selected - I want a good test:

I click OK and press the F5 key to execute the package in Debug Mode:

I open the generated file in Notepad - not very legible:

I fire up the SSIS Raw File Reader:

I have to click File|Open and navigate to the file to view the columns:

After that, clicking Tools|Read File loads the data:

And this is a lot easier to read than the Notepad version!

I like it. Good work, Simon!

:{> Andy

Technorati Tags: SSIS Raw File Raw File Reader

Read Variable Value From a Flat File In SSIS

I recently saw a post on MSDN forums about loading a date value from a flat file into an SSIS variable. I thought I'd document one way to do it - I don't claim this is the best way, but it is one way that works.




First, I created a file named C:\FileDate.txt and put the date 1/12/2007 in it.

Next, I created a new SSIS package and renamed it VarFromFile.dtsx. I dragged a Data Flow task onto the Control Flow.

Double-click the Data Flow task to edit it. Drag a Flat File Source onto the Data Flow:

Double-click the Flat File Source to edit it. Click the New button to create a new Flat File Connection Manager:

I gave the Flat File Connection Manager the name FileDate, added a Description, and set the File name to C:\FileDate.txt:

Next, click the Columns item on the left to view the Row and Column delimiters and contents of the file:

Click OK to close the Connection Manager editor. The FileDate Flat File Connection Manager we just created is visible in the Flat File Connection Manager combobox:

Click OK to close the Flat File Source editor. Click the Control Flow tab. Right-click in the white space of the Control Flow tab and click Variables from the menu:

Create a new variable by clicking the New Variable icon (upper left). Name it vdtFileDate and configure the properties as shown:


Note: You cannot change the scope of the variable in the Variables window. Scope is controlled by the object that had focus when you created the variable - in this case, it was the Package.

Next, return to the Data Flow by either double-clicking the Data Flow Task or clicking the Data Flow tab. Drag a Script Component onto the Data Flow. You will be prompted to select the function of this script component (Source, Destination, or Transformation). Choose Transformation:

Connect the data path (green arrow) from the Flat File Source to the Script Component:

Double-click the Script Component to open the editor. On the Input Columns page, select Column 0:

On the Script page, enter vdtFileDate into the ReadWriteVariables property:

Click the Design Script button to open Microsoft Visual Studio for Applications and perform the following edits:

  • Beneath the Inherits UserComponent line add Dim myDate As Date
  • In the Input0_ProcessInputRow subroutine add
    With Row
    myDate = CDate(.Column0.ToString)
    End With
  • After the End Sub for the Input_ProcessInputRow subroutine, add a new subroutine:
    Public Overrides Sub PostExecute()
    Me.Variables.vdtFileDate = myDate
    MsgBox(Me.Variables.vdtFileDate.ToString)
    End Sub
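Assembled, those edits produce a ScriptMain class along these lines (a minimal sketch of the code above; as noted below, the MsgBox is for demonstration only):

Public Class ScriptMain
    Inherits UserComponent

    Dim myDate As Date

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        ' Capture the date value from the single row read from the flat file
        With Row
            myDate = CDate(.Column0.ToString)
        End With
    End Sub

    Public Overrides Sub PostExecute()
        ' Write the captured value to the vdtFileDate package variable
        Me.Variables.vdtFileDate = myDate
        ' Demonstration only - remove before deploying!
        MsgBox(Me.Variables.vdtFileDate.ToString)
    End Sub

End Class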


IMPORTANT: Do not leave the MsgBox command in deployed code. It will halt execution at this point. The MsgBox is here only to display that the variable value has, in fact, been read from the file!

Execute by pressing the F5 key or clicking the green "Play" arrow on the SSIS IDE toolbar. You should see the results in a message box:

I'm certain there are better ways to get a variable value from a flat file. Please comment if you know one!

:{> Andy

Technorati Tags: SSIS Variable File

Simple "Changed Rows" SSIS Conditional Split Expression Generator

When building a dimension load data flow for SSIS ETL, I want to detect changed rows using a Conditional Split Transformation.

To demonstrate, I built a generic package:

The Lookup outer joins to the destination table and returns the surrogate key. If this returns a NULL, the record is new. But what if the row is there and there's no good way to tell it's been modified? You have to compare each and every column in the source and destination rows.
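Detecting the new rows is the easy part: the New Rows condition is just a null test on the key returned by the Lookup. For example, with a hypothetical surrogate key column named DW_Dimension1Key:

ISNULL(DW_Dimension1Key)

(ISNULL in the SSIS expression language returns a Boolean, so it works as a Conditional Split condition as-is.)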

I find it painful to type all that out, so I have created this script which works well if:

  • The column names in the source match the column names in the destination.
  • A consistent aliasing convention is used for the Destination columns (here, I have aliased all my destination columns by adding the prefix "DW_" to each column name).

I just pop it into SSMS, execute it, copy the results to the clipboard...

... and then paste them into the Changed Rows condition of my Conditional Split transformation:

Voila! With some modification, it could do even more.

Here's the script:


declare @SourceServer varchar(255)
declare @DestinationServer varchar(255)
declare @SourceDB varchar(255)
declare @DestinationDB varchar(255)
declare @SourceSchema varchar(255)
declare @DestinationSchema varchar(255)
declare @SourceTable varchar(255)
declare @DestinationTable varchar(255)
declare @Sql varchar(8000)
declare @Output varchar(8000)
declare @DestinationPrefix varchar(25)

set @SourceServer = ''
set @DestinationServer = ''
set @SourceDB = 'Stage'
set @DestinationDB = 'DataWarehouse'
set @SourceSchema = 'Stg'
set @DestinationSchema = 'DW'
set @SourceTable = 'Table1'
set @DestinationTable = 'Dimension1'
set @DestinationPrefix = 'DW_'

-- Gather the source table's column names into a global temp table
set @Sql = 'USE tempdb;
if exists(select * from sys.objects where name like ''##tmpSourceFields%'')
  drop table ##tmpSourceFields;

use ' + @SourceDB + ';
select c.name
into ##tmpSourceFields
from sys.columns c
inner join sys.tables t on t.object_id = c.object_id
inner join sys.schemas s on s.schema_id = t.schema_id
where t.name = ''' + @SourceTable + '''
  and s.name = ''' + @SourceSchema + ''';'

--print @Sql
Exec(@Sql);
set @Sql = '';

-- Gather the destination table's column names into a global temp table
set @Sql = 'USE tempdb;
if exists(select * from sys.objects where name like ''##tmpDestinationFields%'')
  drop table ##tmpDestinationFields;

use ' + @DestinationDB + ';
select c.name
into ##tmpDestinationFields
from sys.columns c
inner join sys.tables t on t.object_id = c.object_id
inner join sys.schemas s on s.schema_id = t.schema_id
where t.name = ''' + @DestinationTable + '''
  and s.name = ''' + @DestinationSchema + ''';'

--print @Sql
Exec(@Sql);

-- Build the Conditional Split expression: one inequality test per
-- matching column pair, OR'ed together with a terminating False
set @Output = ''
select @Output = @Output + '(' + s.name + '!=' + @DestinationPrefix + d.name + ') || '
from ##tmpSourceFields s
inner join ##tmpDestinationFields d on d.name = s.name;

set @Output = @Output + 'False';

select @Output;
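With the defaults above and a pair of hypothetical matching columns named Column1 and Column2, the script emits an expression like this (the trailing False simply terminates the chain of ||'ed comparisons):

(Column1!=DW_Column1) || (Column2!=DW_Column2) || False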

:{> Andy

Technorati Tags: SSIS Expression Changed Rows T-SQL script

posted Tuesday, January 09, 2007 3:17 PM by admin with 0 Comments

Ordering OLEDB Source Adapter Output

There are a couple approaches to ordering the rows delivered by an SSIS OLEDB source adapter.

First, open an SSIS package and add a Data Flow Task to the Control Flow. Open the Data Flow to proceed.

Approach 1 - Inline sorting using the Sort transformation

Configure the OLEDB source adapter to return all rows from a table:

Add a Sort transformation to the data flow and connect the OleDB source adapter to it as shown:

Double-click the Sort transformation to edit it. Select the fields you wish to sort on, in order, and specify ascending or descending sort order:

The problem with this approach is it can slow things down quite a bit if there's lots of data in the table you're sorting.

Approach 2 - Load the data sorted from the source

Configure the OLEDB source to use a SQL Command. Include the desired sort order in the Order By clause as shown:

Click OK to close the OLEDB source adapter editor.

Right-click the OLE DB source adapter and click "Show Advanced Editor...":

Navigate to the Input and Output Properties tab. Click on "OLE DB Source Output" and set the IsSorted property to True as shown:

Next, expand the "OLE DB Source Output" node and the "Output Columns" folder beneath, and click on the desired sort columns.

Set the SortKeyPosition property to indicate both sort direction and key position. Direction - ascending or descending - is indicated by the sign of the number (negative = descending, positive = ascending). Key position is indicated by the absolute value of the number.

Therefore, a SortKeyPosition of -1 indicates [ExpYear] is the first field listed in the Order By clause, and is sorted descending:

Similarly, a SortKeyPosition of 2 indicates [ExpMonth] is the second field listed in the Order By clause, and is sorted ascending:
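For reference, those settings correspond to a source query like the following - a sketch assuming the ExpYear and ExpMonth columns of the AdventureWorks Sales.CreditCard table:

SELECT ExpMonth, ExpYear
FROM Sales.CreditCard
ORDER BY ExpYear DESC, ExpMonth ASC;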

This method is preferable to the first for performance and scalability.

:{> Andy

Technorati Tags: SSIS Order By Ole DB Sort

posted Sunday, January 07, 2007 12:26 AM by admin with 0 Comments

SSIS Raw File Reader

Simon Sabin has released an SSIS Raw File reader. What a great idea!

I haven't had an opportunity to tinker with this yet, but I plan to test it later this week.

:{> Andy

Technorati Tags: SSIS Raw File Reader Simon Sabin

posted Sunday, January 07, 2007 12:20 AM by admin with 1 Comments

2006 - the Year in Review

This is probably my last post of 2006. It's been a good year. Not perfect, but very good.


SQL Server 2005 got lots of traction in the industry. Most SQL Server technologists I know agree five years was a long time to wait for a new release, but the feature set matches or exceeds the development effort.

Most shops I deal with have either migrated, are testing 2005, or have plans to in 2007.

Service Pack 1 was released and Service Pack 2 is on its way.

Team Edition for Database Professionals - aka Data Dude - went from CTP1 to RTM in six months. Very impressive development cycle!


It's been a good year for the Richmond Developer Community.

We started a new SQL Server Users Group, which is now the official PASS chapter for Richmond, VA! We also held two successful MSDN Code Camps - and the leadership team is planning more for 2007.

Speaking of leadership, the team did an outstanding job in 2006 - thanks to all who led and participated at every level! You folks rock!


Personally, it's been a good year too.

Christy and I bought a house in Farmville, VA - completing our move from Jacksonville, FL back home to Virginia.

We recently learned we're going to be parents again! :)


Business-wise, it's also been a cool year.

I moved from a temp-to-perm position to a permanent consulting gig, and was then recruited by Solid Quality Learning! The relationship with Solid Quality allows me to be an independent consultant. It's nice to be working for me again, although my boss is sometimes a jerk... ;)

I learned a couple difficult lessons as well. Without going into detail, suffice it to say this year affirmed my long-held business standards regarding the importance of integrity, loyalty, and trust. At my age and with my experience with people and in the industry, I am not often surprised by people - but I was surprised this year. My lovely bride Christy has an applicable saying about such times: "Good judgment comes from experience, and experience comes from bad judgment." Amen. I believe it is best to always treat people as you want to be treated because you never know...

I also experienced new levels of trust and respect. I worked with an incredibly talented team on a cool project. Loyalty was a hallmark of our experiences on the team. The result? Against seemingly insurmountable odds and obstacles, both internal and external, we succeeded - and made it look easy! My experiences at Solid Quality Learning have underscored the value of loyalty and integrity in all we do. The professionals that lead this company are at once the most talented, intelligent, down-to-earth, humble, and open people on the planet. It is an awesome honor to be part of this organization!

I was honored several times this year:

  • one of the authors of Professional SQL Server 2005 Integration Services (Wrox)
  • allowed to participate on the leadership team for the Code Camps
  • honored to lead the Richmond SQL Server Users Group
  • honored to lead the Richmond .Net Users Group
  • nominated for MVP
  • honored to deliver the Team Edition for Database Professionals keynote at the Philadelphia Launch Event
  • honored to be invited to Redmond several times to participate in TAP and certification discussions
  • honored to work with a fantastic team to develop an industry-changing application (which I cannot talk about!)
  • honored to be asked to join Solid Quality Learning as a mentor

I don't do resolutions, I merely set goals for the foreseeable future. I was able to accomplish two of three goals I set at the end of last year. I find three is a nice round number for goals - and I am working on my three goals for 2007 this last afternoon of 2006.

Here's to 2007 - may you have a safe, prosperous, and happy new year!

:{> Andy

Technorati Tags: 2006 Year in Review trust Solid Quality Learning integrity new baby 2007

posted Sunday, December 31, 2006 4:44 PM by admin with 0 Comments

Out of Canada...

I'm on the way back to Farmville from Guelph this morning - with mixed emotions.

It will be great to see Christy, Stevie Ray, and Emma when I get home! I always miss them when I'm on the road.

But I also met some good people in Guelph. They were simply awesome to work with (and for). I believe I made some new friends.

It's always good to be able to "geek out" with people doing cool work. I'm very fortunate in that most of the people I work with (and for) are doing cool work. I love this job!

But it's even more cool when I have time to geek out over a couple pints at a good local pub before hitting the road - even if someone had to twist my arm. ;)

:{> Andy

Technorati Tags: Geek out pints pub friends

posted Friday, December 15, 2006 12:19 PM by admin with 0 Comments

You learn something new everyday...

So....

Work finds me in Guelph (pronounced "Gwelf"), Ontario this fine evening. As I type this, I'm awaiting room service - pasta alfredo with chicken and mushrooms. Yum.

I almost didn't get out of the airport this afternoon. But I learned something very important about Canadian Customs: they don't care one whit for (in my part of Virginia, we'd say "they don't cotton to") Americans coming into their country to work! Who knew? Certainly not moi...

At the airport, I was handed a customs document to fill out. It had a couple checkboxes labeled "Why are you here? Pleasure, Business." I'm here to conduct an SSIS class, so I naturally checked "Business." Wrong answer!

After a bunch more questions: "Who do you work for? Myself. Who hired you? I subcontract for a global virtual corporation. How many people work for your company? Just me. What kind of work is it? Training. What kind of training? Microsoft SQL Server Integration Services. And why did they call you to train them on this? I wrote part of this book about it last summer. Where were you on the night of June 21st?"

And the looks kept getting meaner...

I was asked to go sit in a waiting area while they "figured this out." I did. After about five minutes, the official returned to tell me "you are offering specialized training." To which I nodded "yes." "You are free to go," he said.

And go I did.

I went right across the street and rented the last Hertz car on the lot (apparently, if you're the last person to get in line at Hertz, they forego the usual "Which model would you like today?" question...). So, what do you know - I learned a couple new things today!

:{> Andy

Technorati Tags: Canada Customs Far from home

posted Sunday, December 10, 2006 8:14 PM by admin with 0 Comments

The Clean Break

For the first time since 2001, I find myself sitting behind the president's desk in the global headquarters of my own business!

The new venture is called Andy Leonard Technologies, Inc. and this is my first full-time day in the new office.

I mostly perform work for Solid Quality Learning as a mentor. For those who are unfamiliar with S. Q. L., it's a fantastic company! Not only are the people industry-recognized experts, they're actually cool! They engineer the entire process of joining their ranks so that it's low-stress. It has allowed me to ramp up quickly - and for that I am very thankful.

Mentoring is a great concept - it's a hybrid between consulting and instructing. Here's how it works: I join teams for a number of days or weeks. While working together, we develop a specific set of objectives - usually to develop template projects, best practices, and establish a foundation for the working environment. Together, we build out example projects using the templates to demonstrate their effectiveness.

In addition to this, I'm also a trainer. When training, I lead excellent classroom-based instruction courses. I currently lead the ETL with SSIS course, but I am ramping up on more course material - hoping to lead others.

In my previous jaunt into business, I operated ASI. ASI specialized in industrial automation and integration. It was a lot of fun for me because it brought together several disciplines I enjoyed (and still enjoy!): engineering, electrical control systems design, and software development.

ASI started in 1995 when I wrote one of the first completely web-based Manufacturing Execution Systems (MES) called Plant-Wide Webs. Plant-Wide Webs started using dynamic HTML before DHTML was widely available, then graduated to ASP. Writing the application and running the business were cool experiences!

I learned a lot about business and myself. :)

Most of those lessons were learned the hard way. I remember looking at the checking account one day during my first few months of entrepreneurship and seeing $80 in there. I thought "I must've done something stupid." I was right, and this was the beginning of my understanding of business development.

When Solid Quality Learning called I was ready. I understood the risks of making the leap. I knew it would be a lot of hard work. But, unlike last time, this time I have a fantastic team supporting me - and outstanding business development support!

The person who deserves the most credit has to be Christy. Not only does she support this decision, she's actively involved - booking my flights, making hotel arrangements, and renting cars... she's awesome! She even jumps onto mapping software and talks me in from the airport to the hotel so I don't drive around lost my first night in a new town! (The car rental people always ask "Do you know where you're going?" and look at me funny when I say "No, but I'll find it!") Christy does this in addition to taking care of Stevie Ray and Emma without help from me (when I'm out of town or holed up in the office) - and she does it without complaining.

Thanks, Cutie. I couldn't do this without you!

It feels good to be back. So far, the new boss is treating me ok... but it's still early on the first day... ;)

:{> Andy

Technorati Tags: Andy Leonard Technologies, Inc. Self-employment SQL Server Solid Quality Learning SQL

posted Monday, November 20, 2006 4:14 PM by admin with 0 Comments

Roll your own error-handler in SSIS

I was working on error handling with SSIS recently and came up with what I believe is a snappy way to address it. The solution presented below is partial. For one, I would encourage you to use a database table to house errors; for another, I would encourage you to store lots more data than merely the ErrorDescription field.

Error logging is crucial to troubleshooting and auditing efforts. If an error occurs in your SSIS package while it is executing production loads, transformations, or transfers, you will want to know as much as possible about it.

To demonstrate, create a new SSIS project named ErrorTest:

I encourage you to develop the practice of renaming packages as you create them. This will make your life simpler once you deploy:

I rename Package.dtsx to ErrorTestPkg.dtsx (don't change the extension):

When you rename a package the following dialog will appear:

Always answer this dialog "Yes".
Drag an Execute SQL Task onto the Control Flow:

Configure it (double-click or right-click and select Edit to open the editor) as shown below. Note any connection will suffice, so long as the SQL Server is running. Also note the SQL will generate an exception (intentionally):

Click OK to close the editor. Right-click the Task and select Execute Task to test:

The task should fail (as planned):

Stop execution and click on the Event Handlers tab:

Check to be sure the Executable dropdown is set to the ErrorTestPkg (the package) and the Event handler dropdown is set to OnError. Click the link to create a new OnError Event Handler for the package. From the Toolbox, drag and drop a Data Flow Task onto the Event Handler:

Double-click the Data Flow Task to edit it. From the toolbox, drag a Script Component transformation onto the Data Flow:

The Script Component can be configured as a Source, Transformation, or Destination. Select Source:

Since a Source has only outputs (a destination has only inputs, and a transformation has both inputs and outputs), you will need to configure the Output. Rename the generic Output 0 to ErrorOutput:

Click on Output Columns and click Add Column:

Rename the Column ErrDescription:

Edit the ErrDescription column's properties - change the datatype to String (DT_STR) and the length to 5000:

Add the ErrorDescription System variable to the ReadOnlyVariables property of the Script Component:

There are other variables you will add to your error handling script component. Remember variable names are case-sensitive, comma-separated, with no spaces in the list.
Click the Design Script button and add the following code snippet as shown below:


        With ErrorOutputBuffer
            ' Start a new (empty) row in the output buffer
            .AddRow()
            ' Populate the row from the System::ErrorDescription variable
            .ErrDescription = Me.Variables.ErrorDescription
        End With


Some explanation: The name of the ErrorOutputBuffer was determined by the name you gave the Script Component Output earlier. When you changed the Output name to ErrorOutput, the script component's output buffer was changed to ErrorOutputBuffer.

To populate the buffer, you must add rows to it. Think of this as starting a new row in a table. If you're familiar with tables in Microsoft Word documents, you know when you start a new row it is empty until you populate it. The same is true here. You are creating a new, empty row in the output buffer to populate with data.

The final step is to load the row. Again, in your production-ready error event handler you will populate more columns in the output buffer than this one - this is for demonstration purposes only:
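As an example of populating more columns, ErrorCode and SourceName are also available as System variables. Add them to the ReadOnlyVariables list, define matching output columns (the ErrCode and ErrSource names here are hypothetical additions to this demo), and the buffer population grows accordingly:

        With ErrorOutputBuffer
            .AddRow()
            .ErrDescription = Me.Variables.ErrorDescription
            .ErrCode = Me.Variables.ErrorCode        ' hypothetical output column
            .ErrSource = Me.Variables.SourceName     ' hypothetical output column
        End With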

Next, drag a flat file destination onto the Data Flow and connect it to the Script Component as shown:

Note: You will likely want to store error data in a database rather than a flat file.

When you configure a flat file destination, you are prompted for which type of file. Delimited files are fairly simple to read - especially with few fields:

Select a location and name for your flat file. I chose C:\ErrorOutput1.txt:

Click Columns to view the columns the file will contain. Note: the columns would be empty if you had not connected a data path from the Script Component prior to configuring the Flat File Destination:

Click OK to proceed. Click the Mappings page to assign mappings between fields in the pipeline and fields in the destination:

Since you built the destination from the metadata contained in the pipeline, auto-mapping should occur when you click Mappings:

Click OK. Your Event Handler should appear as shown:

Start debugging the package by pressing F5 or clicking the Debug Run arrow. Your ExecuteSQL task should fail as before:

Click on the Event Handlers tab to determine the run status of your Package OnError Event Handler. Double-click the Data Flow Task and note one row was written to your error file (ErrorOutput1.txt):

Open the ErrorOutput1.txt file to view the error stored there:

In this example, we built a package to demonstrate logging package errors to a data destination - albeit a flat file. This technique can be used to catch errors and store them for troubleshooting posterity.

:{> Andy

Technorati Tags: SSIS OnError Event Handlers Logging

posted Thursday, November 16, 2006 1:21 AM by admin with 0 Comments

FAQ Friday

I was honored to be G. Andrew Duthie's guest today on FAQ Friday. (I know, "next time post something before the event, Andy.")

Thanks, Andrew, for asking me to be on the webcast!

We talked about SQL Server Integration Services, Team Edition for Database Professionals, Team Foundation Server, and SQL Server 2005. We managed to get in some tips and tricks for the query window in SQL Server Management Studio, including viewing and executing SQLCmd statements in an SSMS query window.

FAQ Fridays rock!

:{> Andy

Technorati Tags: FAQ Friday SQL Server Integration Services Management Studio SQLCmd

posted Friday, October 27, 2006 4:16 PM by admin with 0 Comments

RCC2-10 days: Countdown to Code Camp

Richmond Code Camp 2 approaches!
Have you registered?

Only 10 (decimal, lest you bit-heads think I mean there are only 2) days remain until the second Richmond MSDN Code Camp! (That's 1010₂ days for you bit-heads.)

The schedule has been posted!

:{> Andy

Technorati Tags: Developer Community Code Camp Richmond, VA

posted Wednesday, September 27, 2006 11:27 PM by admin with 0 Comments

Load a package variable with a dataset in SSIS

Occasionally in SSIS, you need to load a dataset into a package variable. To do this, first create a variable (use the Object data type):

Next, drag an Execute SQL Task onto the Control Flow canvas:

Double-click the task to edit it. Set the Result Set property to Full Result Set. Choose a Connection and enter some SQL:

Navigate to the Result Set pane and click the Add button to create a new Result Set mapping. Name the Result Set 0 and assign it to the variable you previously created.

This will populate the variable with an ADO.Net dataset.
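Downstream - in a Script Task, for example - you can cast the variable back to a DataSet and work with its tables. A minimal sketch, assuming the variable is named MyResults (a hypothetical name) and is listed in the Script Task's ReadOnlyVariables:

Dim ds As DataSet = CType(Dts.Variables("MyResults").Value, DataSet)
Dim dt As DataTable = ds.Tables(0)

' Demonstration only - show how many rows the query returned
MsgBox(dt.Rows.Count.ToString & " row(s)")

Dts.TaskResult = Dts.Results.Success

(The default Script Task template already imports System.Data, so DataSet and DataTable resolve without additional Imports statements.)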

:{> Andy

Technorati Tags: SSIS Variables DataSet

posted Sunday, September 24, 2006 1:48 AM by admin with 0 Comments

Free! (for 180 days...)

I regularly speak to folks who tell me they wish someone would give them an MSDN subscription so they could play with all the cool new technology. I sincerely hope every single person with this desire gets their wish - if not from an individual (as happened to me), from their company. An MSDN subscription is a cool thing.

Until then, I recommend folks do what I did: get the trial versions!

You can build yourself a handy little enterprise on a workstation that has 1G RAM and some hard drive space. It won't be fast, but you can learn cool things nonetheless. Here's how:

1) Download and install Virtual Server - it's completely free (as in beer). With this snappy software you can create virtual machines on your workstation and install any operating system you desire. Use Virtual Server to do just that - build a virtual machine before proceeding.

2) Download the 180-day trial version of Windows Server 2003. Install this as the operating system on your new virtual machine.

3) Download the 180-day trial version of SQL Server 2005. Install this on your new VM.

4) Download the 180-day trial version of Visual Studio 2005. Install this on your new VM.

There you have it - a development virtual workstation that will allow you to learn and grow and try cool new things, for the next six months, at least!

:{> Andy

Technorati Tags: Developer Community VS2005 Visual Studio SQL Server SQL SQL2k5 Virtual Server MSDN Windows Server 2003 2005

posted Wednesday, August 23, 2006 10:34 AM by admin with 0 Comments

Tuning SSIS

Elizabeth Vitt - along with Donald Farmer, Ashvini Sharma, and Stacia Misner - provides an excellent look inside the SSIS engine in this article.

:{> Andy

Technorati Tags: SSIS performance tuning

posted Tuesday, August 08, 2006 12:29 AM by admin with 0 Comments

Donald Farmer is blogging again

Donald Farmer, Microsoft's Group Program Manager for SQL Server Business Intelligence and Integration Services, is blogging again!

Mr. Farmer is a legend in the business intelligence community. Even so, he's always made time to answer questions from this aspiring author.

His post entitled Tales of Two Bills is an interesting read.

:{> Andy

Technorati Tags: Developer Community Donald Farmer SSIS blogs

posted Sunday, July 16, 2006 6:14 PM by admin with 0 Comments

Odd Error in SSIS

I'm unable to find anything online about this, so maybe someone out there has seen something similar:

Sometimes, when attempting to execute an SSIS package inside the IDE, the package fails to execute and I see a momentary (second-tary, actually) flash of a command shell titled SQLDumper.exe.

It happened this morning when firing a package that reads data from one local database, de-normalizes it, and writes it to another local database.

I may post this on the MSDN Forums. If someone gives me the answer, I will update this blog.

:{> Andy

Technorati Tags: SSIS SQL Server SQLDumper

posted Monday, June 12, 2006 11:42 AM by admin with 0 Comments

Reston Code Camp Rocks!

I just arrived home (literally) from the third Reston Code Camp. It was awesome! G. Andrew Duthie, Brian Noyes, Vishwas Lele, and company sure know how to organize a code camp! :) Great job guys!

I was honored to present Beginning SSIS Development early, then host a Chalk Talk about Agile Database Development Practices after lunch. Right after that session, I presented SSIS Development with Team System - which went well until my Virtual PC finally gave up the ghost... durnit! I covered by walking through some SSIS development tips and tricks.

I learned a lot about Smart Clients from Brian's sessions. I'll have to read some of his books as soon as possible. I also enjoyed Vishwas' session on Atlas - some neat stuff there!

Frank La Vigne was also in attendance, and presented on Tablet PCs. Frank is the Tablet man!

Code and the decks from my two presentations are available for download at the Mad Code Camp website.

Congratulations to the Reston Code Camp Team for such a successful event!

:{> Andy

Technorati Tags: Reston, Va Code Camp Developer Community

posted Sunday, June 11, 2006 12:31 AM by admin with 0 Comments

Preparing for the Reston Code Camp!

I'm pretty psyched about heading to Reston tomorrow for the Reston Code Camp!

I get to meet some really cool people. And I'm looking forward to sitting in on a few sessions and learning some cool stuff! I'll have a few copies of Professional SQL Server 2005 Integration Services on hand for swag.

This is also the weekend we begin moving into our new house - another thing to be psyched about!

If you read this blog and attend the Reston Code Camp tomorrow, please introduce yourself. Hope to see you there!

:{> Andy

Technorati Tags: Reston, Va Code Camp Developer Community SSIS

posted Friday, June 09, 2006 2:22 PM by admin with 0 Comments

June 8 2006 - SSIS Tips and Tricks Presentation

Last night's SSIS Tips and Tricks presentation to the Richmond SQL Server Users Group was lots of fun!

I tried a couple of things differently: First, it was nearly all code. I had one slide at the beginning of the presentation and then just dove into code for the rest of the evening. Second, I used Camtasia Studio to record segments of the presentation.

The first recording, on SSIS Constraints, is available online. Check it out and let me know what you think!

:{> Andy

Technorati Tags: Developer Community SQL Server Richmond, Va SSUG SSIS Constraints Camtasia

posted Friday, June 09, 2006 10:24 AM by admin with 0 Comments

Speaking Engagements: June 2006

In June, I'm speaking at the following locations / dates:

1 June 2006 - "Scrum with Visual Studio Team System" to the Richmond .Net Users Group.
8 June 2006 - "SSIS Tips and Tricks" to the Richmond SQL Server Users Group.
10 June 2006 - "Beginning SSIS Development", "SSIS Development with Team System", and a chalk talk on "Agile Database Development Practices" to the Reston, VA Code Camp.
27 June 2006 - "Scrum with Visual Studio Team System" to the Capital Area .NET Users Group.

If you read my blog and attend one of these events, please introduce yourself and let me know!

:{> Andy

Technorati Tags: Developer Community Code Camp Speaking SSIS SQL Server Team System Team Foundation Server

posted Saturday, May 20, 2006 11:01 PM by admin with 1 Comments

Configure Reporting Services to consume SSIS packages

A really cool feature of SQL Server 2005 is the ability to use a SQL Server Integration Services (SSIS) package as the source for a SQL Server Reporting Services (SSRS) report. There's an example in chapter 17 of the Wrox book Professional SQL Server 2005 Integration Services.

Before you can take advantage of this functionality, you have to make a couple changes to some SQL Server config files.

Note: SSRS and SSIS must be installed on the same server to utilize this functionality.

Important Note: There are security implications to consider (more on this at the end of this post).

Navigate to C:\Program Files\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies (assuming you installed SQL Server on the C: drive) in Windows Explorer. Open the RSReportDesigner.config file in a text editor:

You can use any text editor - or even the Visual Studio 2005 IDE. Call me old-fashioned, I like Notepad. :)

Remove the comments from the config sections (there are two) for the SSIS data extension:

Don't forget the End Comment markup ("-->") at the end of each commented line. 
Save the file and close Notepad.

Next, navigate to C:\Program Files\Microsoft SQL Server\MSSQL.n\Reporting Services\ReportServer and open the rsreportserver.config file:

Note: On my demo machine, the file is located in ...\MSSQL.3\.... This will vary depending upon the number and type of SQL Server 2005 family of products you have installed on your machine.

Again, uncomment the section for the SSIS data extension:

Save the file and close it.

Now, when you create an SSRS data source, you have the option of using SSIS as the data source:

How does this work? When your report refreshes, Reporting Services will call and execute the SSIS package. An important security consideration is the SSRS service account provides the security context for the SSIS package. For this reason, it is a good idea to hard-code SSIS connection string security data - you do not want to give the SSRS account the same permissions as you give an SSIS package.

:{> Andy

Technorati Tags: SQL Server 2005 SSIS SSRS Reporting Services Integration Services Data Source

 

posted Saturday, May 20, 2006 3:46 PM by admin with 0 Comments

Dynamic SQL in SQL Server Integration Services (SSIS)

I sometimes miss the Dynamic Properties Task in DTS. I used it for all sorts of functionality - and I'm struggling to replace some of that flexibility in complex SSIS development.

I recently developed a package with dynamic SQL serving as the data source for a Data Flow. I accepted the mission and here's an example to demonstrate how I accomplished this:

Drag a Data Flow Task onto the Control Flow canvas:

Double-click the Data Flow task to edit it. Drag a DataReader Source onto the Data Flow canvas and double-click it for editing. The Advanced Editor opens. Select or create an ADO.Net connection manager on the Connection Managers tab:

On the Component Properties tab, enter an SQL Statement in the SqlCommand property (I query a table I created named dbo.TestTable which contains three columns: TestID [Identity PK], TestName [VarChar(20)], and TestValue [int]):
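For reference, dbo.TestTable can be created with DDL along these lines (a sketch matching the column list above):

CREATE TABLE dbo.TestTable
(
    TestID int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    TestName varchar(20) NULL,
    TestValue int NULL
);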

Verify the SqlCommand field names on the Column Mappings tab:

Drag a Flat File Destination onto the Data Flow canvas and (this is important) connect the output of the DataReader Source to the Flat File Destination with a Data Path (green arrow) [Note: Previously, I incorrectly identified this as a Precedence Constraint. Precedence Constraints are found only on the Control Flow. Data Paths are the connecting arrows on the Data Flow. My apologies for any readers misled.]:

Double-click the Flat File Destination to edit it. On the Connection Manager page, click the New button to create a new Flat File Connection Manager:

Assign the Flat File Connection Manager a name (I used "OutFile") on the General page. Enter a filename ("OutFile.txt") and specifics regarding collation and row delimiters:

Click Columns to configure column details.

Why? I always create Flat File Connection Managers for new files from the Flat File Destination task. If the target file does not exist, the Destination task has no better way to know which columns to expect.

Note the Preview grid displays the expected columns (read from the DataReader Source via the data path connector):

Click OK to create the Connection Manager and return to the Flat File Destination Editor:

Click the Mappings item in the page list to view (and auto-map) the DataReader Source columns to the Flat File Destination columns:

Return to the Control Flow tab and click any empty space on the Control Flow canvas.

Why? You are about to create a new Variable. Unlike DTS variables (which were all global in scope), SSIS variables have specific scope. Scope is determined, for better or worse, by the Task or SSIS Object that has focus when you create a new variable. I say "for better or worse" because I've not yet discovered a method for changing variable scope during the variable creation process. Clicking on the Control Flow canvas gives it focus, and Package-scoped (global) variables may then be created.

Click View, then Other Windows, then Variables to display the Variables window. Set the Data Type to String and enter a query in the Value field. Note: the query used here must expose the same columns as the query in the SqlCommand property of the DataReader Source, or the Data Flow will fail during pre-execution package validation (there is most likely a method to address this - more later, perhaps...). I created a variable named SQLOut with a default value of "SELECT 0 AS TestID, 'Zero' AS TestName, 0 AS TestValue":

Drag a Script Task onto the Control Flow canvas and connect it to the Data Flow Task with a Precedence Constraint (green arrow) as shown. (The Script Task is a Control Flow item - its Data Flow counterpart is the Script Component - so this connector is a true Precedence Constraint.):

Double-click the Script Task to open the Script Task Editor. Click on Script in the page list and add the variable name created earlier (SQLOut) to the ReadWriteVariables property:

Click the Design Script button to open the Visual Studio for Applications (VSA) Script Editor. Enter the following script in the Main subroutine of the ScriptMain class:

' Display the design-time (default) value of SQLOut
MsgBox(Dts.Variables("SQLOut").Value)

' Overwrite the variable with new dynamic SQL
Dts.Variables("SQLOut").Value = "SELECT 1 AS TestID, 'Unos' AS TestName, 12 AS TestValue"

' Display the updated value
MsgBox(Dts.Variables("SQLOut").Value)

Dts.TaskResult = Dts.Results.Success

To test, close the VSA Editor and click the OK button to close the Script Task Editor. Right-click the Script Task and click Execute Task:

The first MsgBox command should execute displaying the default contents of the SQLOut variable:

Click the OK button. The second MsgBox command should display the updated SQLOut variable contents:

Click the OK button. The Script Task should complete without error:

From here, the SQLOut variable can drive the Data Flow - for example, by binding it to the DataReader Source's SqlCommand property through the Data Flow Task's Expressions collection.

:{> Andy

Announcing the Richmond SQL Server Users Group!

Frank La Vigne and I are proud to announce the formation of the Richmond SQL Server Users Group!

The first meeting is scheduled for 6:30 PM 11 May 2006 at 4600 Cox Road, Glen Allen, VA. I'll be speaking, and since this is the first meeting I think I will start at the beginning - with a presentation on installing and configuring SQL Server 2005.

We're working on a website and hope to have it online sometime this month.

Special thanks to Fahrenheit Technology for sponsoring our inaugural meeting!

I hope to see you there Thursday!

:{> Andy

Technorati Tags: SSIS Sql Server Developer Community Richmond Virginia

posted Sunday, May 07, 2006 1:58 AM by admin with 0 Comments

SQL Server Integration Services Conditional Precedence

SQL Server Integration Services offers vast functional enhancements over DTS. One impressive change is the flexibility of the new Precedence Constraints.

Right-clicking a precedence constraint reveals a context menu:

Clicking Edit reveals most of the new functionality. To demonstrate, let's walk through a simple example.

First, add a new variable named Now as shown below:

Drag an Execute SQL Task onto the Control Flow workspace:

Double-click the task to open the editor. Enter some generic SQL in the task as shown:
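If you're following along without the screenshot, the "generic SQL" can be as simple as this stand-in query (my assumption - any statement returning the current time will do):

SELECT GETDATE()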

Change the ResultSet from "None" to "Single row":

Click the Result Set navigation link. Add "Now" as the Result Name and "User::Now" as the Variable Name. This maps the single row returned by the query to the user variable name "Now" - which assigns the current server time to the [User::Now] variable:

Test the package by clicking the Run button (or F5). The package fails with the following error:

Clicking the Execution Results tab reveals the error to be related to the Result Binding:

To correct the error, add an alias ("Now") to the query:
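Continuing with the stand-in query above, the corrected statement would be:

SELECT GETDATE() AS [Now]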

The package now succeeds:

Add two more Execute SQL Tasks. Configure them as shown below. Change the Name property of the second new task (the third task overall) to "Execute SQL Task 31 - 60":

Drag a new precedence constraint from the first task to the second. Right-click the precedence constraint and select Edit:

Change the Evaluation operation from "Constraint" to "Expression and Constraint":

Leave the Value set to Success. Enter "DatePart("s", @[User::Now])<=30" in the Expression textbox. Leave the Multiple Constraints option set to Logical AND - this means the previous task must succeed (the Success value must be satisfied) and the Expression must evaluate to True (the seconds of the current minute must be between 0 and 30 inclusive) for the constrained task to execute. Click OK to close the Precedence Constraint Editor:

The precedence constraint now displays an fx symbol to indicate it is controlled by an Expression:

Drag another precedence constraint from the first Execute SQL Task to the third ("31 - 60") task. As before, enter "DatePart("s", @[User::Now])>30" in the Expression textbox. Leave the Multiple Constraints option set to Logical AND - the previous task must succeed and the Expression must evaluate to True (the seconds of the current minute must be between 31 and 59 inclusive) for the constrained task to execute. Click OK to close the Precedence Constraint Editor:

The second precedence constraint now indicates an Expression. Click the Play (F5) button to start execution. Depending on which half of the current minute execution occurs in, one of the two images below will display, indicating that precedence constraint's Expression evaluated True. Clicking Play 30 seconds later will cause the other precedence constraint's Expression to evaluate True:

Precedence Constraints in SSIS provide lots more functionality than their predecessors (no pun intended!).

:{> Andy

Technorati Tags: SSIS Sql Server Precedence Constraint

posted Sunday, April 30, 2006 10:28 PM by admin with 0 Comments

Last Minute Fill-In Speaking Gig

I had an opportunity to speak to the Richmond .Net Users Group last night as a last-minute fill-in guest speaker.

The folks in attendance were simply awesome! It was an honor (as always) to be able to present to a cool group of folks. Frank LaVigne, MVP (congratulations Frank on being awarded the 21st Tablet PC MVP on the planet!) and I had concerns about presenting a database-centric topic to a developer group. But it went very well. It turns out there are a lot of developers delving into SQL Server 2005 - either by choice or necessity.

"Hey you!
Do you know how to spell SQL?
Good.
You're our new database resource."


You laugh... this is a paraphrase of the discussion that converted Andy from a web developer to a SQL Server DBA...

I'm working on more Team System presentations for upcoming events, like the Richmond Code Camp. I will be presenting "SSIS with Team System" there and at upcoming user group engagements.

Plus, Frank and I are (re-) starting the Richmond SQL Server Users Group. The first meeting is planned for May 11, 2006 and we plan to hold monthly meetings the 2nd Thursday of each month. More details will follow...

:{> Andy

posted Friday, April 07, 2006 9:36 PM by admin with 0 Comments

Interface Not Registered / Class Not Registered Error in SSIS

I ran into a couple errors recently when trying to create new SQL Server Integration Services (SSIS) projects. One error stated:

Failed to save package file "C:\Documents and Settings\Administrator\Local Settings\Temp\1\tmp2B.tmp" with error 0x80040155 "Interface not registered".

The other stated:

Failed to save package file "C:\Documents and Settings\Administrator\Local Settings\Temp\1\tmp2B.tmp" with error 0x80040154 "Class not registered".

These errors were encountered on a new laptop and a new virtual PC. Both had Visual Studio 2005 installed, so I suspected some sort of conflict. On the VPC, I loaded VS 2005 first, followed by SQL Server 2005 Developer. I noticed the client tools didn't install, although I thought I'd checked that box. To correct, I executed the setup for client tools. When completed, my client tools were available and creating an SSIS project succeeded.

But the laptop wasn't so simple. I thought "I have a solution!" and went about re-installing the client tools, but it didn't correct the issue!

So I popped on Google and searched for similar issues. I found a helpful post at MSDN Forums. Executing two RegSvr32 commands did the trick:

regsvr32 msxml3.dll
regsvr32 msxml6.dll 

When all else fails, read the instructions!

:{> Andy

posted Thursday, March 30, 2006 6:36 PM by admin with 0 Comments

April Speaking Engagements

My "Buy The Book 2006" tour rolls on! :)

In April, I will be speaking at the following locations:

  • 12 Apr 2006 - Triangle .Net User Group
  • 20 Apr 2006 - Roanoke Valley SQL Server Users Group
  • 22 Apr 2006 - Richmond MSDN Code Camp!

If you're going to be in the area one of those dates, please stop by and introduce yourself as someone who reads this blog!

:{> Andy

posted Thursday, March 30, 2006 6:36 PM by admin with 0 Comments

NoVa SQL Server Users Group

I was honored to present a session on Beginning SSIS Development at the NoVa (Northern Virginia) SQL Server Users Group last night. I met some great people and learned a lot.

I can hear you thinking "What? You learned a lot?" Absolutely! I always learn when presenting. In fact, I would argue presenting / teaching is the best way to learn.

Presenting forces me to remain knowledgeable in my field. And it provides immediate feedback about my success at this endeavor.

Any new technology - especially one as flexible and powerful as SSIS - will be used in unanticipated ways. This will generate unexpected results, which will in turn generate perplexing questions to presenters - usually in front of a large crowd! :)

It is all good.

It's called "learning." Experience has taught me learning is best done in groups. No one can master it all and not all at once. This is why teaching teaches.

Thanks, Nova SSUG, for the opportunity and knowledge shared!

:{> Andy

Adventures in SQL Server 2005 Replication

So... I'm on a call yesterday afternoon and the topic of replication comes up. Before I can catch myself, my kneejerk response spills out: "There has to be a better way to accomplish this than replication!" My colleague is taken aback by the statement. "You're the SQL guy," he says... and I am left to ponder why I reacted the way I did.

So... I ponder it most of the evening and again this morning. And I've reached a conclusion: I reacted because of the trouble I've experienced in the past making changes to SQL Server 2000 published databases.

About half the time, if memory serves, I ended up poking around in a sys% table turning bits off so I could actually make the desired change. This, after dropping the publication while standing on one leg facing East. Yes, it was tricky. And I never attained that warm fuzzy feeling of comfort that accompanies knowing what to expect from the technology.

But it isn't fair to judge SQL Server 2005 by the same standard - which would place me in the same camp as those java folks complaining about things Microsoft fixed last century (a place I choose not to camp).

So... I'm walking through setting up a local pushed publication on my laptop - just to see what it's like.
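For context, here is roughly what the New Publication Wizard does under the covers - a T-SQL sketch, with the database, publication, and table names invented for illustration:

-- Enable the (hypothetical) database for publishing
EXEC sp_replicationdboption @dbname = N'DemoDB', @optname = N'publish', @value = N'true'

-- Create a publication (assuming transactional here)
EXEC sp_addpublication @publication = N'DemoPub', @status = N'active'

-- Add a table (article) to the publication
EXEC sp_addarticle @publication = N'DemoPub', @article = N'DemoTable',
    @source_owner = N'dbo', @source_object = N'DemoTable'

The wizard (and the sketch above) assumes SQL Server Agent is running - which leads to Issue 1 below.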

Issue 1:

Clicking Report, View Report reveals:

Resolution: Change the SQLAgent service properties and start the service:

On to Attempt #2...

Issue 2:

Clicking Report, View Report reveals:

Now I remember Brett, the network guy, telling me I have access to the domain, but I am not a member of the domain. I wonder if this has anything to do with this issue. Hmm...

I check my account properties - I have a local account and I'm a local administrator:

But this isn't captechventures\aleonard - this is the local aleonard. Checking the local Administrators group, aleonard is in there, but no domain account:

Resolution attempt 2: Since I'm not part of the domain, adding my domain account is out (I can't see it). But I'm prompted to enter a domain\account when setting up the publication... what to do, what to do:

Resolution: Since I cannot see the domain, I use my local machine account credentials.

Success!

I need to work on a method to test the scalability of this architecture, but Step 1 is complete.

:{> Andy

Technorati Tags: Sql Server 2005 Sql Server Replication

posted Tuesday, March 21, 2006 6:32 PM by admin with 0 Comments

On Book Authoring (for the first time), part 5

The book is now on bookshelves.

The work isn't over, by any measure. There are updates to maintain. The authors publicize the book at speaking engagements and conferences. In this case, it's not because it's required by anyone (no one's told me I'm required to promote the book anyway...) - I'm proud to be part of the team that authored this work.

I believe we formed a bond while working together. I certainly made new friends - most notably Haidong Ji - whom I enjoyed chatting and swapping emails with during the writing and editing phase.

I was privileged to be employed alongside several of the other authors during the writing: Brian Knight, Douglas Hinson, Jason Gerard, and Mike Murphy. Writing together while working together was a blast! At various times I believe we all took turns serving as coach, mentor, editor, encourager, co-conspirator, commiserator, brother, and friend. It was a team effort I will treasure for the remainder of my life.

The book has received awesome accolades - both public and private. My couple of chapters would not have been possible without lots of help from others - some of whom were not mentioned in the credits section of the book. First and foremost, Jeff Beehler provided support "above and beyond" both duty and my expectations. Without his help, the chapter about SDLC would not have been possible. Thanks Jeff!

Ashvini Sharma and Donald Farmer provided moral support, encouragement, and unfettered access to the SQL Server Integration Services team.

Kamal Hahti deserves recognition for an inspiring demonstration that motivated me to dig into some of the atypical functionality exposed by SSIS. This proved especially useful while writing the chapter on interfacing with external applications.

:{> Andy

posted Tuesday, March 14, 2006 6:32 PM by admin with 0 Comments

February speaking engagements

I will be delivering a Beginning SSIS Development presentation at the Baltimore SQL Server User Group meeting 1 Feb 2006, and at the Hampton Roads SQL Server User Group 16 Feb 2006.

Beginning SSIS Development covers:
 - Introduction to the new SSIS Integrated Development Environment
 - Control Flow and Data Flow discussion
 - An overview of controls, connections, and package flow
 - Troubleshooting techniques
 - A demonstration (or two, depending on time)
 
Details and directions are available at the respective websites. They should be fun evenings - I always enjoy presentations! If you're in the area and read this blog, stop by and introduce yourself!
 
:{> Andy
posted Wednesday, January 11, 2006 6:20 PM by admin with 0 Comments

Production Virtual Servers?

I recently documented building a virtual server for use with the Team Foundation Server Dec 2005 CTP. It was a lot of fun / work - and it's documented at VSTeamSystemCentral.com.

An opportunity at work prompted use of another virtual server to facilitate a data-transformation-intense software migration. In order to make this particular migration work, I needed to install an old ODBC driver. I did not have access to the driver installation - apart from installing a suite of software products purchased back in the day. I needed to land the migrated data on an existing test server and I did not want to corrupt other software currently installed there.

So... I built a VPC, installed Windows and SQL Server 2k. I loaded the legacy software package, which installed the driver, then built some DTS packages to migrate the data from the source to my target server. Voila!

It's worked wonderfully. Among the many benefits is the low impact to our existing Development / Test environment.

This experience - coupled with clustering functionality built into Virtual Server 2005 R2 - has me seriously considering Production Virtual Servers. Thoughts?

:{> Andy

How to build a Team Foundation Virtual Server (TFS)

As promised, you can learn more about my Team Foundation Server setup experience at VSTeamSystemCentral.com.

I have started posting a series of articles (Team Foundation Virtual Server) describing the process I used to build and configure a virtual PC similar to the VPC Microsoft distributed with their releases of Visual Studio 2005. At the time of this writing, 6 of 12 to 15 articles have been posted. I attempted to capture screenshots for every step I used to build a functional TFS server.

I do not provide software. The series assumes you have access to an MSDN subscription or licensed copies of Windows 2003 Server Enterprise, SQL Server 2005 Developer, Visual Studio 2005 Team System (or the trial version), and Microsoft Virtual PC 2004 SP1.

Registration is required to access the content.

:{> Andy

posted Thursday, December 22, 2005 6:12 PM by admin with 0 Comments

Team Foundation Server - Dec CTP

I just finished installing the Dec CTP of TFS.

I haven't tested the installation yet (that's the task du jour), but I am quite impressed with the installation. Concise instructions, well-written and well-formatted help. And (for me at least), no ugly and unexpected dialogs.

I installed on a VPC running Windows 2003 Server enterprise configured as a standalone AD / DNS server on its own local domain. My only issue was with SQL Server Surface Area configuration: I needed to expose TCP for communications. The pre-installation checks caught this and warned me before the TFS installation even started - which was really nice.

All in all, installation of the Dec CTP was a positive experience. Now, on to testing...

 

:{> Andy

SQL Server 2005 and Team System

For arguably the first time ever, DBAs have a stable Microsoft development environment with integrated source control.

There's a chapter in the upcoming Wrox Press book on this very topic. It serves as an introduction to some source control and project management features available to DBAs using SQL Server 2005.

There's lots more information available online about Team System - I find most of my information on blogs.

There is also a related Wrox Press book due out in February.

"Why should we care about Visual Studio, Andy?" I'm glad you asked. If you haven't yet tinkered with the new SQL Server 2005 development environments, you may not know this: When you develop SQL Server applications in SQL Server 2005, one of the tools available to you is the Visual Studio 2005 environment. In fact, if you develop business intelligence applications (Analysis Services, Integration Services, etc.), you utilize a tool called the SQL Server Business Intelligence Development Studio (BIDS), which is a version of Visual Studio with tools for developing SQL Server business intelligence applications.

If Visual Studio is a new environment for you, learning about it will make you a more effective DBA when you move to SQL Server 2005.

If you don't use a source control product currently - or even if you do - I encourage you to examine the new version of Visual SourceSafe (2005) and Team System.

:{> Andy

On Book Authoring (Part 4)

Now for the fun part: the work is done.

For me, the work ended with one last all-nighter. I was a week late on the copy-edits and was basically holding up the book. It was definitely not intentional, but facts are harsher realities than intentions.

"There I was..." in the middle of moving myself and my family from Jacksonville, Florida - our home of 3-1/2 years - to Virginia. The purpose of the move was to get my family closer to our extended families in Virginia. I had relocated my wife, children, cats, and most of our stuff and was preparing to fly out of Richmond for one last day of packing and work. Sitting in the Richmond airport, I popped open my laptop, connected to the free airport wireless service (is anyone in Atlanta reading this?) and began checking email. I hadn't checked email in a few days, so there were a few messages waiting. Some marked "URGENT" - from people you don't want to receive URGENT messages when you're writing.

It wasn't a pretty situation. Months earlier, I'd snagged my screenshots using Beta 2. I saw a webcast earlier in the week about Team System (one of my chapters is an introduction to SDLC for DBAs), and the wizards looked really different. So I knew I would have to snag new screenshots. But to snag new screenshots I needed access to the Release Candidate 1 (RC1) applications, which I did not have. This meant the book could go to press with inaccurate screenshots or the chapter could be pulled. Not good.

Luckily, some good folks at Microsoft responded to my repeated barrages of tear-stained spam and provided a copy of the RC1 software. It turned out a disk had been FedEx'd to my home in Jax - and that email was among those I read while sitting at RIC awaiting my delayed flight. As I was proof-reading my response to the URGENT message, stating the material I needed to complete my work was waiting for me in Jax, a followup URGENT message arrived - this one with PLEASE RESPOND IMMEDIATELY appended to the subject line.

I stopped proof-reading and clicked Send.

Don't get me wrong, I am not complaining. These folks had more than enough reason to be demanding, upset, and worried. I gave every indication of dropping the ball, and the publishing business is extremely time-critical in nature. Drop a few balls and you have no book to show for lots of time and effort. So I understood.

When I arrived in Jax (late), my FedEx was propped against the front door. I opened the package and began snagging and editing. I snagged into the wee hours of the morning, then into the morning, then past the time I was supposed to show up for my last day at work. I worked into the early afternoon and finally it was done. And accurate.

I shipped it off to the editors for another final once-over, put in an appearance at work to say some tough good-byes, headed back to the house, paid the carpet cleaner, packed up my old pickup, grabbed four hours of sleep, then aimed the old Jeep north for 11 hours of interstate travel. It was the hardest thing I've done in years. Was it worth it? Are you kidding?

Amazon has the book listed, showing the authors. It's almost showtime!

:{> Andy

On Book Authoring (for the first time), part 2

In an earlier post, I shared some revelations I experienced as I participated in authoring a couple chapters for an upcoming SQL Server 2005 Integration Services book. I feel inclined to share some more, so here goes:

1. It's a lot of work. I've heard that from authors before, and I suppose it just didn't take. An experienced author and very good friend shared that he wouldn't wish writing on his worst enemy. I understand the sentiment and the editing has just started - I'm certain there's more to come.

2. It's a once-in-a-decade (-perhaps-lifetime) honor and opportunity. I've been reading books published by this publishing label for a decade. I always admired the style and content of their books. Some were better than others but all in all, they publish cool stuff. To have an opportunity to write for them is humbling and amazing all at the same time.

On balance, the work is well worth the honor and opportunity.

:{> Andy

posted Monday, August 08, 2005 5:57 PM by admin with 0 Comments

Which "flavor" DBA are you?

   I received a cool compliment today from a peer who's a developer. He said, "You know, I really like having a DBA on my team!" I have to tell you, it made my whole day!

   It led to a discussion about past experiences and expectations, and I shared something I thought was pretty much common knowledge: there are three types of DBAs. My peer was shocked, so maybe the knowledge isn't so common after all.

   The three "flavors" of DBAs I define are:

  1. System, Operations, or Production Support DBAs - these DBAs write maintenance plans in notepad and have no qualms whatsoever about executing in command-line. They were DBAs in the old days, when we carved our own ICs out of wood. They will get your server and database back online fast - and with less data corruption than anyone else on the planet. They live for torn pages and I/O faults.
  2. Application Support DBAs - these DBAs are familiar with one or more (usually complex) applications. I'm talking PeopleSoft, Siebel, and SAP here. If you want to customize a screen or write a one-off web application, you desperately need these folks.
  3. Database Developers - these DBAs are ruthless bit-heads. They use bigint and byte fields for masking binary states (see the sketch after this list). They can optimize a stored procedure in their sleep and wrap an API around a database so developers never have to consider writing SQL that directly hits tables. They are performance freaks who will work 18-hour days on weekends to test in Production - when it's "safe."
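A quick T-SQL illustration of that bit-masking habit - a sketch, with the flag values and their meanings invented:

-- Use an int as a set of boolean flags (bit meanings are hypothetical)
DECLARE @flags int
SET @flags = 0
SET @flags = @flags | 1  -- turn on bit 1 (say, "active")
SET @flags = @flags | 4  -- turn on bit 3 (say, "archived")

-- Test a bit
SELECT CASE WHEN @flags & 4 = 4 THEN 'archived' ELSE 'not archived' END AS ArchiveState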

   Do you think DBAs fall into these categories? Do you know any that do? Do you see yourself in there anywhere? Do you have more or less or different "flavors" for classifying DBAs? 

:{> Andy

Technorati Tags: Sql Server DBA flavor