Analysis Services via HTTP

May 30, 2008

UPDATED: Added more information on permissions.

It is pretty common policy to separate data servers from web servers.  Microsoft may be making headway in securing IIS, but IT departments commonly forbid running the two on the same production server.

Microsoft has provided us with a very simple tool to use in this situation.  They call it the Analysis Services Data Pump.  This ISAPI extension has existed (I think) since SQL 2000, though I recall it had a different name.  It is a DLL hosted by IIS that accepts requests over HTTP and forwards them to the Analysis Server.  The server returns results to the DLL, which forwards them back to the requestor.  Here is the link to the article that explains how to set it all up:

The only thing I would add is that the article references several different directories:  D:\tmp\isapi, C:\Inetpub\wwwroot, and C:\wwwroot.  All of these should be the same place.  I don’t see any reason you couldn’t just create a virtual directory in IIS that points directly to %InstallPath%\OLAP.2\OLAP\bin\isapi so that you don’t have to make another copy of the DLL.  The only reason I can think of to make a copy is that it would isolate you from a SQL Server update changing the file before you’ve had a chance to test it, and I’m still not sure that’s reason enough.  Don’t forget that if you don’t copy the DLL into the wwwroot directory, you’ll need to make sure your access permissions are correct.  IUSR automatically has access to anything in the wwwroot folder; if you put the DLL somewhere else (and you use anonymous access), you’ll have to grant IUSR permission to that folder.
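Once the virtual directory is in place, clients connect by pointing their connection string at the data pump DLL instead of at the server name.  A minimal sketch, assuming a hypothetical web server name (webserver.example.com), a virtual directory called olap, and a sample database name:

```
Provider=MSOLAP;Data Source=http://webserver.example.com/olap/msmdpump.dll;Initial Catalog=Adventure Works DW;
```

Everything else about the client (Excel, SSMS, ADOMD.NET) works the same as with a direct connection; only the Data Source changes.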

I just set it up; it works great and was very easy to do.  If you work in an environment where port access requests are difficult and a web server already has HTTP ports open, this might be a good solution.


Getting Connected

May 22, 2008

I’ve been spending a little time getting my online presence all linked together.  Twitter feed showing on my blog, Twitter updating my Facebook status, FriendFeed hooked up to Twitter, Facebook, and Yelp!…  I’ve also started to invest some time in keeping Twitter up to date and this has started to return rewards for me.

I’m really starting to get into being connected to folks via the internet.  At one point I thought that computers and networks were driving people apart – instead of going down to someone’s desk to ask a question, you send an email.  But I’m discovering that once you plug into a network of people online, be it with Twitter or Facebook or Yelp or wherever, these services allow us to become more connected with the people around us – people geographically close and people far flung.

It takes some work, but to me the investment in maintaining an online social presence until it reaches critical mass is well worth it.

TDWI Salary Survey

May 21, 2008

Curious whether you are making what your peers are making?  Teo Lachev points us to the TDWI Salary Survey, which plots salary against title and responsibilities.

Avoiding the Annoying in The File Connection Manager

May 19, 2008

Let’s say you want to delete a file as part of your SSIS package, and let’s say that the file is created during the execution of the package.  Simple – you use the File System Task.  You just add the task to the package and create a file connection manager to point to the file…only the file doesn’t exist yet.  The file connection manager editor won’t let you save a file name that points to a file that doesn’t exist.

Simple fix – all you have to do is set the path using the Properties window in SSIS instead of the dialog that comes up when you double-click the connection manager.  It doesn’t look like DelayValidation is required here.

Using Twitter to Connect with SQL Folks

May 19, 2008

Jason Massie (statisticsio) has written a post on the SQL folks he follows on Twitter.  It is really great to know who everyone is following so that we can foster a SQL-Twitter community.  Thanks, Jason.

You can get to my Twitter stream over on the right hand side of this blog…


MS Live Mesh

May 15, 2008

I was just accepted to the Live Mesh beta and I have to say that it seems pretty cool.  I have installed the Mesh client (should I call it a client?) on my laptop and have used the online “Live Desktop” on another computer and they sync quite well.

After adding a file on one, it shows up on the other within a few seconds.  One computer is on a fast wired network (using the webtop client), and the laptop is connected to the internet by a 2.5G wireless phone.

I can think of a few different scenarios in which I’ll use Mesh and I’m interested to see where Microsoft takes it.  I’m interested to hear of ways that others are using Mesh…


Duplicate ErrorCode Columns

May 2, 2008

A colleague of mine was selecting data from a table that contained a column called ErrorCode.  The OLE DB Source in SSIS was giving an error that said:

The package contains two objects with the duplicate name of “output column “ErrorCode” (13)” and “output column “ErrorCode” (279)”

She was having trouble figuring out why it was complaining since the table she was pulling from had only one column named “ErrorCode”.

If you remember, data flow transformations can add columns to the data flow that describe error conditions, and one of those columns is “ErrorCode”.  This was the problem: SSIS wanted to reserve ErrorCode for the column it adds, and that conflicted with the column in the table.  I wasn’t able to reproduce this by adding a column called “ErrorCode” to a select statement – only when it was in the table from which the statement selects.

We went to the source table and changed the name from ErrorCode to something else, which fixed the problem.  The other way to fix it is to use the AS clause in SQL to rename the column in the source query.  Changing the output name on the Columns tab in the OLE DB Source didn’t fix the problem, so you’ll have to go a little deeper to resolve the conflict.
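A minimal sketch of the source-query fix, assuming a hypothetical table named dbo.ImportLog with the conflicting column:

```sql
-- Rename the conflicting column in the source query itself; the OLE DB
-- Source then sees "SourceErrorCode", leaving "ErrorCode" free for the
-- column SSIS reserves for its own error output.
SELECT
    LogID,
    ErrorCode AS SourceErrorCode,
    ErrorMessage
FROM dbo.ImportLog;
```

The table and column names here are made up for illustration; the point is simply that the rename has to happen in the query (or the table), not in the Columns tab of the source component.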