Transfer mode set to Awesome
http://transfermodeawesome.posterous.com
Dorkly bits on Exchange, Windows, software and whatever else I can think of.

Wed, 14 Dec 2011 09:30:08 -0800 Using sp_change_users_login 'auto_fix' to fix user/login
http://transfermodeawesome.posterous.com/using-spchangeuserslogin-autofix-to-fix-userl
Original article: http://rainstorms.me.uk/blog/2010/02/21/using-sp_change_users_login-auto_fix-to-fix-userlogin/

When moving a database between SQL Server instances, the server-level login information is not included in the move.  This results in a database that has users defined but no associated logins, known as orphaned users.

This article details the stored procedure that fixes the issue: sp_change_users_login 'auto_fix', 'username'.
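Based on the linked article, the whole fix is a couple of statements per moved database (the database and user names below are placeholders):

```sql
USE MyDatabase;  -- the database that was just moved

-- List the orphaned users first
EXEC sp_change_users_login 'Report';

-- Re-link a database user to the server login of the same name
EXEC sp_change_users_login 'Auto_Fix', 'username';
```

Note that sp_change_users_login has since been deprecated in favor of ALTER USER ... WITH LOGIN, but it is the right tool on SQL 2005-era servers.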

Ted Lilley (lilleyt)
Mon, 26 Apr 2010 16:51:00 -0700 Free SQL Profiler
http://transfermodeawesome.posterous.com/2010/04/free-sql-profiler.html

I'm using a SQL database from my workstation, which has only Management Studio installed but not the profiling tools (they don't come with the stand-alone Management Studio install).

I need to run a trace on the server, but I don't have full access and I don't want to go through the full SQL installer just to get the profiling tools, even if I were to figure out how to do that.

I stumbled across this open-source profiler for SQL Server 2005/2008 Express:

http://sites.google.com/site/sqlprofiler/

It does the trick.  I had to run it as a user with administrative privileges (access to sys.traces is key), so your mileage may vary.

It's not a very sophisticated tool.  For example, I didn't see a way to save the results of my trace.  That's a big annoyance.

The filtering and event system is fairly straightforward.  I had to identify the database I was interested in, as well as the events that looked like they might have anything to do with the queries I was hunting: commits, SQL statements, stored procedures and batches all seemed promising.  In the end, this gave me a lot more information than I wanted, which I then had to comb through in the program's UI.  If I had a better idea of what the events meant, I could probably cut down the volume of info.  Dumping to CSV would let me use my favorite data-dicing tool, Excel.

But, all in all, I was able to get the information I was looking for, so I appreciate a free tool.  Perhaps it can do you some good as well.

Fri, 02 Oct 2009 20:42:00 -0700 SQL Server Performance Monitoring
http://transfermodeawesome.posterous.com/2009/10/sql-server-performance-monitoring.html

While working with a customer recently, we encountered a SQL issue that was pegging the CPU.  As it turned out, a certain query was causing the performance problem, but that was difficult to see without some SQL management tools.  This is a brief article about some of those tools.

The real key to seeing what was going on was a combination of the Task Manager and SQL Management Studio Express Edition.  Task Manager was needed to show when the performance hit was taking place, and SMSEE was used to show what the active SQL process was doing.

In SQL Management Studio, connect to the database.  Then go to the Monitoring folder in the tree view.  This is the bottom folder.  Under that folder is the Activity Monitor.  Right-click on this and choose to monitor active processes.

You will then see a table of all of the processes connected to the database, including management studio.  This view shows which connections are actively processing and a summary of what command SQL is currently processing for that process.  Note that it is a static view and needs to be periodically refreshed by the Refresh button.

By right-clicking on the active process, you can go to a detail view that shows the full query which is currently being executed.  This view is also static and must be refreshed.

When the CPU was taking a big hit in Task Manager, we would look at the detail view to see the query.  It was a single query which didn’t change for the duration of the CPU activity, so we knew that was the one causing the problems.

Another tool is the Performance Dashboard Reports for Management Studio.  These contain a number of useful canned reports for looking at average and instantaneous top resource consumers (queries) on your database.  They are extremely useful and available free from Microsoft at: http://www.microsoft.com/downloads/details.aspx?FamilyId=1d3a4a0d-7e0c-4730-8204-e419218c1efc&displaylang=en

There is also the Database Engine Tuning Advisor built into SQL Management Studio (not the Express edition), but it only works against SQL editions above Express.  One way to employ it would be to convert one of our test machines from SQL Express to a Standard Edition SQL Server and use it with our product long enough to generate the profile data the advisor uses to suggest optimizations.

Another set of tools is the SQL Health and History Tool as well as the SQL Best Practices Analyzer.  While I haven’t had time to investigate these yet, they sound useful as well, especially the BPA.

Thu, 01 Oct 2009 14:51:00 -0700 Evil, thy name is AUTO_CLOSE
http://transfermodeawesome.posterous.com/2009/10/evil-thy-name-is-autoclose.html

Okay, perhaps that’s overstating the case a bit, but on the other hand, perhaps not.

AUTO_CLOSE is the feature in SQL Server which will “close” a database (i.e., clear the cache, write everything to disk, release the file lock, etc.) some 300ms after the last connection to the database is closed.  It is turned off by default in all editions of SQL Server except SQL Express, which forces it on.  This is ostensibly to ease XCopy deployment, according to MS.
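The setting can be inspected and flipped per database; a sketch with a placeholder database name:

```sql
-- 1 in these columns means the option is on
SELECT name, is_auto_close_on, is_auto_shrink_on
FROM sys.databases
WHERE name = 'MyAppDb';

-- Turn it off (AUTO_SHRINK, which comes up later, gets the same treatment)
ALTER DATABASE MyAppDb SET AUTO_CLOSE OFF;
ALTER DATABASE MyAppDb SET AUTO_SHRINK OFF;
```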

A customer called with performance problems.  We use SQL Server 2005 Express Edition as part of our product, and we are constantly making single-use, non-overlapping connections to the database.  In this customer’s case, every time a new connection was made to the SQL Server, it would sit and chew on the CPU for a good couple minutes as it was processing.  This is not the normal behavior we expect to see.

So we looked at the usual things.  The two most notable things about his setup, aside from a database on the large end of things, were a configured memory constraint on the database of around 100MB, and a lot of Application Event log entries saying “The server resumed execution after being idle %d seconds. This is an informational message only. No user action is required.”  These were accompanied by reports of the clearance of the cachestore.

As far as the memory constraint goes, we had set that manually at some time earlier.  Since the client had increased the system memory in the interim, I put it back at the default, which is somewhere around 2TB.  This is SQL Express, so it still has a 1GB memory limit coded into it somewhere else.  While this improved performance, it was only about a fifth faster.
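The constraint in question is the instance-wide max server memory option; a sketch of putting it back to the default (verify the numbers against your own server, since I'm quoting them from memory):

```sql
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- 2147483647 MB is the out-of-the-box "unlimited" value; Express
-- still enforces its own 1GB buffer-pool cap regardless
EXEC sp_configure 'max server memory (MB)', 2147483647;
RECONFIGURE;
```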

I was also already familiar with the Application Event log message, although I was puzzled as to why it was there.  We encountered AUTO_CLOSE when we first started using SQL Express, since only SQL Express sets AUTO_CLOSE to True.  Our customers had been chewing our ears off about their log files filling up with useless messages (why, if the message states “No user action is required”, can you not simply disable that log message?).  We dutifully relayed the fact that it was a “harmless” message, but for this reason alone we began configuring the database to not AUTO_CLOSE.  As we find out now, that is not the most important reason to turn off AUTO_CLOSE, by any means.

So there were three puzzles here.  Why were those messages occurring if we don’t configure AUTO_CLOSE, why are there cachestore messages, and are either of them related to the performance problem?

The cachestore messages are the easiest to clear up.  They are a new, related message emitted by the same event, added in SQL Server 2005 SP2.  So that's where they are coming from, and at least MS is now telling us there is more to the picture than “No user action is required”.  Harmless?  Apparently there is a cost associated with the automatic closing of the database after all.

Further research showed that there are two “naughty” settings for performance with SQL Server: AUTO_CLOSE and AUTO_SHRINK.  TechNet states that best practice is to turn them both off.  So Express doesn't follow best practices.  AUTO_CLOSE is even slated to be removed from SQL Server entirely in a future version!

As I mentioned, we knew that already, although we didn’t know the associated performance issues.  Now we knew that as well.  But if we didn’t have AUTO_CLOSE on, why were we seeing these things?

It’s a combination of things.  First of all, there is apparently no way to force the SQL Express instance, as a whole, to default that setting to off.  If there were, you could set that and be done with it, at least until you installed a new instance.  Then you’d need to set it off there as well.

Or, if the setting were a permanent part of the database, you could set it there and be done with it.  Microsoft’s documentation would lead you to believe that this is the answer.  In fact, this is the only way to set the attribute, on the database itself.

So that’s what we do.  However, there is a catch.  Apparently this setting isn’t actually stored within the database; it is kept somewhere more ephemeral.  You can tell because if you detach the database and reattach it, the setting is lost and defaults to True again!  We actually do that procedure quite a bit, because we troubleshoot not by operating on live customer databases but by following Microsoft’s stated instructions for moving databases between instances, which is to detach and reattach.  Other processes require this as well, such as moving the database between customer servers.

Now, if the setting were part of the db, that wouldn’t be a problem.  But apparently it isn’t, because the re-attach resets the AUTO_CLOSE attribute to TRUE.  Blech.
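A sketch of the round trip that bites us, with placeholder names and paths:

```sql
EXEC sp_detach_db 'MyAppDb';

-- ...copy the .mdf/.ldf files wherever they need to go...

CREATE DATABASE MyAppDb
    ON (FILENAME = 'C:\Data\MyAppDb.mdf')
    FOR ATTACH;

-- On Express this reports 1 again, even if AUTO_CLOSE was off before
SELECT is_auto_close_on FROM sys.databases WHERE name = 'MyAppDb';

-- So every detach/attach has to end with:
ALTER DATABASE MyAppDb SET AUTO_CLOSE OFF;
```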

So the answer is to reset this attribute every time we do the kind of support that requires us to copy the database.  It also means we have a lot of customers out there who have unwittingly had that setting changed by our troubleshooting procedures.  Thanks MS!  Sorry for the sarcasm, this is just one particularly tricky set of circumstances that don’t seem to be well thought out.  I am not above that either.

Fortunately or unfortunately, the result with the customer is that they did not see an enormous increase in speed with this change, so it apparently isn’t the major culprit I thought it might be in the first place.  Still, it would be nice for SQL Express to have a better story on this.

Thu, 13 Aug 2009 00:10:00 -0700 Moving Redmine Wiki pages and Connecting to the Bitnami MySQL instance
http://transfermodeawesome.posterous.com/2009/08/moving-redmine-wiki-pages-and.html

Apparently, you can’t move wiki pages between Redmine projects like you can with Redmine issues.
I found this out because I was trying to rename a project’s URL in Redmine, but once the identifier is assigned, I guess you can’t do that either.  Or perhaps you can using a similar method to what I discuss here, I just didn’t think to go back and try mucking with the ID directly in MySQL.
In any case, the next best thing is to go into the MySQL tables and reassign the wiki_id of the pages you want to move.  So I made my new project, then tried to find a client with which to manipulate the tables.
I’m running Bitnami’s Redmine 0.8.4 on Windows, and after a little while I came across the MySQL GUI Tools Bundle.  They have a nice Windows installer, which worked fine.
The only trouble I had initially was getting the credentials.  I didn’t have any network issues, as I loaded the tools directly on the same server.  When it asked me for the connection details, I found I had to look up the username and password.  These are stored in Redmine’s database.yml file, under the production instance, which is located in Redmine’s config directory.
The GUI Query tool complained that I didn’t specify a default schema, mainly because I had no idea which ones were available.  Apparently Bitnami gets in on the act here, since once I was able to open the database I could see the schema name is actually “bitnami_redmine”.  This was ok by me.
The last piece was actually getting the identifiers and issuing the update.
From the “wikis” table I was able to get the id (not “project_id”) of the wikis in question.  I had to do the select manually since there didn’t seem to be a simple way to make the tool show me the table contents automatically.
Then the query was something like:
UPDATE wiki_pages
SET wiki_id = <new>
WHERE wiki_id = <old>
Executing the query worked, and then I was able to list all of the pages under the index of the new Wiki through Redmine.  One last touch was to copy and paste the contents of the old start page to the new start page manually.  Then I deleted the old start page and I was done.
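Putting the session together (the ids here are made up; pull the real ones from the wikis table first):

```sql
-- Find the wiki ids for the source and destination projects
SELECT id, project_id FROM wikis;

-- Re-home every page from the old wiki (say, id 7) to the new one (id 42)
UPDATE wiki_pages
SET wiki_id = 42
WHERE wiki_id = 7;
```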

Tue, 11 Aug 2009 21:43:00 -0700 Testing SQL queries without a database
http://transfermodeawesome.posterous.com/2009/08/testing-sql-queries-without-database_11.html

I recently devised a way to test SQL queries without the benefit (or headache) of an actual database. This works on the simple cases I need.

I’m big on code testing and test-driven development in general. I also write a lot of small utilities for my company’s products that perform SQL queries. So naturally I went looking for a simple solution to test the results of the queries.

The tests generally consist of an input set of data, a query, and a result set of data or a result status. There tend to be lots of little queries to test. The cases (or datasets) of interest don’t need to be complicated or large; they just need to represent the problem at hand, usually just a few rows. Additionally, the queries all operate on the same tables but shouldn’t interfere with each other. So the need is for a query to be tested against different datasets representing the same database, and to do so in isolation.

A regular database has the benefit of parsing and performing the query as intended, but also has the baggage of the system. In order to change datasets, you either need to inject the data into the database via a query or you need to detach a database file and attach a file with a different dataset (we won’t consider running multiple database instances). It is also difficult to capture the initial dataset for each test case. The system has to be maintained and kept running, etc.

It would be nice to have an easier method of generating and querying datasets, preferably text-based. Getting rid of the baggage of a running database system would be nice as well.

The solution I found does this, but it has limitations.

As for the data format, CSV works fine. The tabular form used by databases lends itself to a format which works in Excel. As Excel is one of my favorite general-purpose tools, CSV is both powerful and accessible.

As for how to access the data, Microsoft provides a built-in ODBC driver for Text files. By using ODBC to run the query, you can run a real Microsoft SQL query against a text file as a database, provided you do some setup and that you are only trying to read the database. Updates and deletes are not supported by the Text driver.

I need to re-edit this post to provide the exact details, but here are the broad strokes:

  1. Using the ODBC Data Sources tool, create a File DSN that specifies the format of your database. Specify the file that represents your table and make sure the format (CSV) and the column headings are defined as fields. This generates a DSN file (I named it txt.dsn) and a schema.ini file. I put these both in the test directory so they get checked into version control and propagated to the build server.
  2. Put the data for a query into the CSV file.
  3. Generate the SQL query that you are testing and, if the test requires a result set, the expected result set.
  4. Use a tool that provides ODBC access (in my case, Python with pyodbc and py.test) to write a test which compares the actual result of the query to the expected result.

The SQL query which you are testing will need to refer to the table by the actual filename, “.csv” extension included. Since I’m testing real queries, I have to rewrite the table references in the query to use the proper filename.
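So a query under test ends up looking something like this (the table and column names are illustrative):

```sql
SELECT OrderId, Total
FROM orders.csv
WHERE Total > 100
```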

Because of the limitations of the text driver, you can test simple select queries (I haven’t tested joins), but I don’t think you can handle more complex things like stored procedures, selecting to tables, variables, etc. etc.

For my purposes, I only need to be able to have and switch between multiple small datasets, which is cake with this system. Put your datasets into CSV files, make sure the schema.ini file gets updated with field definitions (using the ODBC data source tool) and direct your queries to the filenames. Beats a real database hands down, while letting you test SQL queries with features like subqueries and aliasing.

You can also keep result sets in CSV files and use Python’s CSV access to compare result rows. In order to be independent from ordering issues with result sets, I convert the actual and expected result sets to Python sets before comparing. Converting to sets also requires that the rows themselves (both actual and expected) be cast to tuples so they may be hashed as set members.
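In Python terms, the comparison boils down to something like this minimal sketch (the data and column names are made up; the real expected sets live in CSV files alongside the tests):

```python
import csv
import io

def normalize(rows):
    # Cast each row to a tuple (so it is hashable as a set member) and
    # map None -> '' so ODBC NULLs compare equal to blank CSV fields;
    # strip stray whitespace while we're at it.
    return {tuple('' if f is None else str(f).strip() for f in row)
            for row in rows}

# Expected result set, as it might live in a CSV file (inlined here).
expected_file = io.StringIO("name,qty\nwidget,3\ngadget,\n")
reader = csv.reader(expected_file)
next(reader)  # skip the header row
expected = normalize(reader)

# Actual rows, as a pyodbc cursor might return them: order differs
# and a NULL comes back as None.
actual = [('gadget', None), ('widget', ' 3 ')]

assert normalize(actual) == expected
```

Because both sides are sets of tuples, row ordering never enters into the comparison.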

Wrinkles

You can’t update records and you can’t delete them. This is essentially read-only. I test these kinds of queries by converting them to SELECT statements where possible. INSERTS are supposedly supported.

When comparing results from ODBC against the CSV files in Python, blank fields come across as None from ODBC but as '' from Python's csv module. A quick list comprehension converts one to the other.

Whitespace in fields is also a comparison issue. Either make sure fields are stripped of whitespace in your CSV, or do whitespace stripping in your code.

Tue, 11 Aug 2009 21:42:00 -0700 Using aliases on table names in SQL DELETE statements
http://transfermodeawesome.posterous.com/2009/08/using-aliases-on-table-names-in-sql_11.html

I ran into a situation where I needed to use a correlated subquery to determine the rows I wanted to delete from a table.

Unfortunately, the table on which I was running the delete was also the one from which I needed matching data to identify the rows. I had to disambiguate the table references of the inner query by using aliasing.

A SELECT (as opposed to DELETE) version of the query is something like:

SELECT * FROM MyTable AS LEFT
WHERE LEFT.Side = 'LEFT'
AND NOT EXISTS (
    SELECT * FROM MyTable AS RIGHT
    WHERE RIGHT.Side = 'RIGHT'
    AND RIGHT.ToLeft = LEFT.ToRight
)

For the moment, don’t focus on what the query is trying to accomplish. Just notice that this SELECT statement relies on the aliasing of the outer table reference (“LEFT”) to disambiguate the inner query. The issue at hand is that when you change this to a DELETE statement, the syntax for the essential aliasing becomes tricky.

DELETE FROM MyTable AS LEFT …

doesn’t work!

After doing a bit of research, I found that it is possible to alias a delete, but the syntax is a bit non-intuitive. The following does work:

DELETE LEFT FROM MyTable AS LEFT …
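Putting the fragments together, the working DELETE looks like this (I've renamed the aliases to L and R here, since LEFT and RIGHT are actually reserved words in T-SQL and make poor identifiers):

```sql
DELETE L
FROM MyTable AS L
WHERE L.Side = 'LEFT'
AND NOT EXISTS (
    SELECT * FROM MyTable AS R
    WHERE R.Side = 'RIGHT'
    AND R.ToLeft = L.ToRight
)
```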

Score one for the syntax police.

Tue, 28 Jul 2009 15:28:00 -0700 Querying SQL with LINQPad
http://transfermodeawesome.posterous.com/2009/07/querying-sql-with-linqpad.html

I don’t remember how I came across LINQPad, but I’m glad I did.
In my work I deal with SQL databases quite often.  My company’s product is based on SQL 2005 Express Edition.  There are many times during troubleshooting a customer issue that I’d like to be able to look at what’s going on in the database.
Until now, I only had two options.  The first is to load SQL Management Studio Express Edition (SMSEE).  While this is a great tool, unfortunately it’s also rather cumbersome for taking quick peeks.  Not to mention the fact that for most customers, the server in question is their primary production machine, which means they are rightly fairly resistant to installing software of any kind.
The second option is the command-line sqlcmd tool which is included in the SQL Server install.  Until now, this has been my primary tool for issuing SQL queries.  It has some important drawbacks, aside from not being a GUI.  The biggest drawback is that, while you can redirect input and output to a file with command-line options, you cannot do so within an interactive session.  So all of your command history goes out the window if you want to save results to a file, plus you have to reissue all of the commands to get the session in the proper state for the command in question.  Additionally, if you redirect output to a file, the output (including errors) no longer shows up in your interactive session, so you can’t tell if you’ve issued an incorrect command along the way without consulting the output file.
Enter LINQPad.  LINQPad is a lightweight (3MB) graphical database client.  I say database client rather than SQL client because its native mode is to process LINQ queries rather than SQL.  While I know very little about LINQ, it does process SQL queries as well, so it is fine for my purposes.  In fact, it looks like it will be an excellent tool for learning LINQ while still providing me the go-to SQL tools when I need them.
LINQ is touted as a next-generation query language, based on SQL to a great extent but also simplified and integrated directly into Microsoft’s .NET language framework.  I’m not sure of all of the ramifications of this, but SQL could sure use some simplification as it seems to me a bit long in the tooth.  I’m sure there are some easy wins in a new, standard query language that takes advantage of the ideas of the last twenty-some-odd years of development advancement.  See the LINQPad page for more info on LINQ.

Using LINQPad

Using LINQPad is as easy as firing it up.  Of course, you have to download it first (at my work, we put it in a bundle of diagnostic tools our technicians can download).  There is a standalone .exe as well as a Windows installer.  This is a Windows-only application, by the way.
The user interface looks a bit like SQL Management Studio with a database structure pane on the left in a tree view, query and results on the right.
You have to select your query language from a dropdown at the top, which defaults to C# LINQ.  I switch it to SQL for the moment.
Adding a connection to a database is as easy as clicking the “Add a connection” link at the top of the database pane.  For most purposes, the default connection settings work fine.  For my database, I had to remember that I have a non-default instance name.  This caused some confusion for more than a few minutes because I’m so used to the instance name used by our product that the difference didn’t even register.  No slight to LINQPad, that’s just my own brain not functioning.
I also believe that you don’t have to be running LINQPad on the same machine as the database server (this is pretty important usually), but since I’m always on my database server I didn’t test this.
Once connected to the appropriate instance name (duh, me), issuing queries is pretty basic.  Compose the query in the query window, press play and look at the results.  You can inspect the database structure in the tree in the database pane.  Very simple, very nice.
Kudos to Joe Albahari, the author of this tool.  He was also responsive in the LINQPad forum when I logged my troubles.  Joe is also the author of C# 3.0 in a Nutshell from O’Reilly.  LINQPad comes with all of the examples from that book.  Good job!
