Thursday, June 19, 2008

Putting New Data into SQLSpyNet with the DTS Wizard

The DTS Wizard is a very flexible tool that allows us to import data from almost any data source, including Excel, DB2, Oracle, Access, and even text files. Why do we need this? Many organizations have data spread throughout the company. Charles will have a spreadsheet of his customers, Mary will have a small Access database that holds all her suppliers, and Jenny, the secretary, might have a text file (or Word document) with the names and phone numbers of all the staff members.

With all of these disparate types of information floating around the office it's pretty obvious that data management can become a real nightmare! Duplication of data is inevitable, and retrieval is almost impossible.


But here comes SQL Server 2000 to the rescue! With the introduction of DTS in SQL Server version 7.0, many organizations have been able to relieve the pain of pulling all these disparate pieces of information together. The DTS Wizard is a point-and-click approach to importing (or even exporting) data. We go through a series of steps, selecting the database into which to enter the data, where the data is to come from, and so forth.

This is all fine for simple data, but what happens when we have more complex data that doesn't have a clear separation between, say, first name and surname? DTS can take care of this too, with your help of course.

Within a DTS package we can embed VBScript to manipulate or format the data, or even to rerun a process. This gives us great flexibility when it comes to altering our data before, during, or after the insert into the database.
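For instance, suppose the incoming rows carry a single full-name column. The cleanup would normally live in VBScript inside the package, but the same job can be sketched in Transact-SQL (say, in an Execute SQL task that runs after the import). The Person table and its FullName, FirstName, and Surname columns here are made up purely for the example:

    -- Split a single FullName column into FirstName and Surname
    -- (the table and column names are hypothetical).
    UPDATE Person
    SET    FirstName = LEFT(FullName, CHARINDEX(' ', FullName) - 1),
           Surname   = SUBSTRING(FullName, CHARINDEX(' ', FullName) + 1, LEN(FullName))
    WHERE  CHARINDEX(' ', FullName) > 0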

There are many aspects to DTS packages, and we will only have a quick look at the basics. I suggest that you play with DTS packages in SQL Server 2000 because they rock! And the flexibility and control you have over the import/export process will surprise you.

Before we actually insert the data into our database, we are going to delete all the data from our database. This will let us start from a clean slate. But before we do this, let's back up our database.

Backing Up the Database Before the Transfer

You might be wishing you didn't have to do this, but do you want to know the cool thing? You do not have to write the backup statement again! Because we wrote a backup task earlier (see Chapter 11, the section titled "Scheduling Jobs"), we can actually force it to run immediately. This means we can create a full database backup just by right-clicking on the job (under Management, SQL Server Agent, and then Jobs), and selecting Start Job.
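Incidentally, if you would rather stay in Query Analyzer, the same job can be kicked off with the sp_start_job system procedure in msdb. The job name below is just a guess at whatever you called it back in Chapter 11:

    -- Run the existing backup job immediately
    -- (substitute the name you gave the job in Chapter 11).
    EXEC msdb.dbo.sp_start_job @job_name = 'SQLSpyNet Backup'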

You will see that the status of the job is set to executing. When the job has finished, the status will change to either Succeeded or Failed.

So there we go, a backup done nice and simply, with no extra code!

Monday, June 16, 2008

Debugging Stored Procedures

I have performed many tasks in numerous different roles, and one of the most frustrating has been debugging stored procedures. But no more! The ability to debug stored procedures as though we were debugging code in any development environment is one of the enhancements to SQL Server 2000's Query Analyzer. We can insert breakpoints, step into, step over, and so forth. This is wonderful for those of us who have tried to monitor what is happening inside a stored procedure.

Previously we could do this with SQL Server 7.0 and Visual InterDev, but there was a lot of overhead in setting it up. With the new debugging tools, all we do is right-click and select Debug. How simple is that?

We have the standard debugging windows as well. We can get the values of variables from the Watch window and view the procedures that have been called but not yet completed in the Callstack window. This makes it easy to migrate from a development environment to using SQL Server 2000. So you Access developers out there must be getting really excited by now!
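If you want something to practice on, here is a trivial, throwaway procedure (the name and logic are purely illustrative): create it, right-click it, select Debug, and watch @Counter climb in the Watch window.

    -- A made-up procedure for trying out the debugger.
    CREATE PROCEDURE dbo.DebugPractice
    AS
    BEGIN
        DECLARE @Counter INT
        SET @Counter = 0
        WHILE @Counter < 10    -- set a breakpoint here and watch @Counter
        BEGIN
            SET @Counter = @Counter + 1
        END
        RETURN @Counter
    END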


What? No More Room?

One of the trouble spots a DBA must keep an eye on is a server's most precious resources: memory and disk space.

If we have several databases on one server, we can find that we run out of disk space, and if that happens, our databases will fail.

Of course, in Spy Net's fictional scenario, that could mean World War III! But in the real world, running out of space can still cause serious problems, especially in mission-critical databases such as utilities or emergency response systems. In this section, we look at the causes of resource failure and several ways to avoid down time, including managing file and log size.

How Memory Affects Database Transactions

Memory-deprived databases fail because tempdb is where most of the changes that you make to your data are performed before they are committed to disk. If you have enough RAM available, SQL Server 2000 will put as much of your database as it can up into RAM. After all, it is much faster to read from RAM than to scan a disk for the information. However, if RAM is in short supply, or you have concerns about the amount of disk space your database is eating, relax, because we even have control over that.
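For example, if SQL Server is crowding other applications off the box, you can cap how much memory it takes with the sp_configure system procedure. This is just a sketch; the 512MB figure is purely illustrative:

    -- 'max server memory' is an advanced option, so expose those first.
    EXEC sp_configure 'show advanced options', 1
    RECONFIGURE
    -- Cap the memory SQL Server will grab (the 512MB value is illustrative).
    EXEC sp_configure 'max server memory', 512
    RECONFIGURE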

When you are in Enterprise Manager you have the option to view how much space is allocated to your database's data files and how much is used. To see this information, simply click the SQLSpyNet (or any other) database within Enterprise Manager, and you will see a summary screen showing these figures.
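The same figures are available from Query Analyzer through the sp_spaceused system procedure:

    USE SQLSpyNet
    -- Space allocated and used for the whole database.
    EXEC sp_spaceused
    -- Or for a single table (the table name is illustrative).
    EXEC sp_spaceused 'Person'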

Shrinking Your Data Files to Reduce the Database

When we are talking about shrinking our data files we are not actually referring to the process of compacting them like a zip program would.

If we shrink our data files, we remove unused data pages. For example, suppose a table stored its data on five data pages and we deleted two pages' worth of that data. Although the table would now need only three pages to store its data, SQL Server 2000 would still have five pages allocated to it.

When we shrink the data files, we simply get rid of the two extra pages that the table is no longer using. This does, however, have restrictions, but I think you get the idea.

What do we do when our data files are too large? Although we cannot shrink an entire database smaller than its original size, we can shrink our data files smaller than their original allocation sizes. We do this by shrinking each data file individually with the DBCC SHRINKFILE Transact-SQL statement, which lets us reallocate how much space the given data file is allowed to use.
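Here is a minimal sketch of the statement in action. The logical file name SQLSpyNet_Data and the 10MB target are assumptions; sp_helpfile will tell you the real logical names for your database:

    USE SQLSpyNet
    -- List the logical file names for the database.
    EXEC sp_helpfile
    -- Shrink the data file down to a 10MB target
    -- (the file name and size are illustrative).
    DBCC SHRINKFILE (SQLSpyNet_Data, 10)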
