The importance of automating processes can't be overstated. Most things start out as a manual procedure until you work out exactly how they should happen, but often they stay that way permanently. Worse, the manual process usually relies on one person who knows the magic sequence, which means something as trivial as a sick day can result in a system or data problem.
If you've got several sources of data that you need to get into a database, you don't have to get too fancy to automate the process. Good old FTP and flat file import will do the trick reliably and never take a day off.
Here are the basic steps to automate an FTP download in Windows followed by a CSV insert into MS SQL Server.
Step one is to fetch the latest data from the remote server via FTP. In Windows you can accomplish this with a script file that feeds commands to the built-in ftp client. A simple example of an FTP script that fetches a single file looks like this:
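A minimal sketch of such a script — the host name, user name, and password are all placeholders you'd replace with your own:

```
open ftp.example.com
myUser
myPassword
binary
get filename.csv
bye
```

The first three lines answer the ftp client's connection and login prompts, `binary` avoids corrupting non-text files, and `get` downloads the file.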
You would save the list of FTP commands into a file with a .scr extension, like CSVDownload.scr
A full list of available commands can be found in Microsoft's documentation for the ftp command.
Next you need a batch script to execute the ftp commands:
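A sketch of that batch script, assuming the .scr file was saved to C:\Scripts (adjust the path to wherever you put it):

```bat
@echo off
rem Run the FTP commands saved in CSVDownload.scr
rem -i disables interactive prompting, -s supplies the script file
ftp -i -s:"C:\Scripts\CSVDownload.scr"
```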
Save that batch script with a .bat extension, like GetCSV.bat
Executing that batch script will open an FTP connection and fire off your commands, resulting in filename.csv being downloaded. The next step in automating the process is to add a simple Windows scheduled task that executes your .bat file at whatever interval makes sense.
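Creating that scheduled task from the command line might look like this (the task name, path, and time are placeholders):

```bat
rem Run the FTP fetch every day at 2:00 AM
schtasks /create /tn "FetchCSV" /tr "C:\Scripts\GetCSV.bat" /sc daily /st 02:00
```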
Flat File Import into SQL Server
Now that you've automated the fetching of your flat file via FTP, the next step is to automatically import that file into SQL Server.
To do this, we'll use an SQL Server Agent Job. Using SQL Management Studio, create a new SQL Server Agent Job:
Fill in the general details and then go to the Steps page to add a couple of steps. The first step (if applicable) is to perform any data cleanup you might need. You might want to truncate the existing table before you import the new data, or make some other changes. If you don't need to do that, then just create the next step, which is to actually import the data.
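If the feed is a full refresh, the cleanup step can be as simple as emptying the destination table before each load:

```sql
-- Remove the previous load before importing the new file
TRUNCATE TABLE dataStationDay;
```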
Depending on your data schema, this isn't always straightforward. One of the most flexible approaches I've found is to create a #Temp table, bulk insert the data into it, move the data from there into the permanent table, and finally drop the temp table. That code looks like this:
CREATE TABLE #dataTMP (
    field1 datetime,
    field2 int,
    field3 int,
    field4 int,
    field5 int,
    field6 int,
    field7 decimal(4,2),
    field8 decimal(4,2)
);

BULK INSERT #dataTMP
FROM 'C:\Data\filename.csv' -- the path where your FTP task saved the file
WITH (
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);

INSERT INTO dataStationDay (field1, field2, field3, field4, field5, field6, field7, field8)
SELECT * FROM #dataTMP;

DROP TABLE #dataTMP;
You'll need to change the table schema to match your data, of course, but this SQL script will do the job for many flat file scenarios; just edit the start row, field terminator, and row terminator as needed.
Next, click on the Schedules page and add an import schedule that repeats on the same interval as the FTP task, but delayed by a few minutes to make sure the FTP fetch has completed.
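If you prefer scripting the schedule over the GUI, the msdb stored procedures can do the same thing. A sketch, assuming a job named ImportCSV (a placeholder) and an FTP task that runs at 2:00 AM:

```sql
-- Daily schedule at 2:15 AM, 15 minutes after the 2:00 AM FTP fetch
EXEC msdb.dbo.sp_add_jobschedule
    @job_name = N'ImportCSV',      -- hypothetical job name
    @name = N'DailyImport',
    @freq_type = 4,                -- daily
    @freq_interval = 1,            -- every 1 day
    @active_start_time = 21500;    -- 02:15:00, encoded as HHMMSS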
Configure notifications or alerts as desired and you're all set! Just make sure that the SQL Server Agent service is enabled and set to start automatically, or your import will never fire.