Migration of a large enterprise from SQL Server 2000 to 2014: a real-life case
This is the first article in a series about a real-life database and IT infrastructure migration project executed for a large US company whose services depend heavily on a fast and reliable database infrastructure. In this article we'll give a brief description of the project, its goals, and the actions taken. In further articles we'll go deeper into the details of each migration step.
Our client's IT infrastructure had been running since 2000 without any major upgrades. A couple of new servers had been added over the years, but the majority of the hardware had been in service for more than a decade.
The development and administration process was largely ad hoc, with no test environment and no proper source control. These factors, combined with business growth and increasing load, degraded the performance and reliability of the system and put the business at risk.
Our aim was to migrate the company's database infrastructure to a new data center and new servers running up-to-date software in order to improve performance, scalability, and reliability.
Highlights
- 25 servers
- 105 databases
- Largest databases about 1 TB in size
- OLTP and OLAP
- 400 SSRS reports
- 6000 DTS packages
- 100 SSIS packages
- Complex replication, both transactional and snapshot
- COM libraries, sometimes without source code
- Source environment: SQL Server 2000 and 2012, Windows Server 2000 and 2003
- Destination environment: SQL Server 2014, Windows Server 2012
New environment
- 2 SQL Server clusters
- The first for read-write operations
- The second for read-only operations
- SSIS and DTS Server
- Reporting server
- Replication distributor server
Migration steps
Database migration
- Backup
We used a stored procedure that backed up each database into four backup files in four parallel threads. Striping the backup this way improves throughput and shortens the backup window.
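The multi-file backup boils down to a single striped BACKUP statement; this is a minimal sketch (database name and paths are illustrative, and the real procedure generates the statement for each database):

```sql
-- Striped backup: one BACKUP statement writing to four files.
-- SQL Server assigns a writer thread per output device, which is
-- what shortens the backup window.
BACKUP DATABASE [SalesDB]
TO  DISK = N'D:\Backup\SalesDB_1.bak',
    DISK = N'D:\Backup\SalesDB_2.bak',
    DISK = N'D:\Backup\SalesDB_3.bak',
    DISK = N'D:\Backup\SalesDB_4.bak'
WITH INIT, STATS = 10;
```

To restore, all four files must be supplied together in the FROM clause.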
- Restore and upgrade compatibility level
Since SQL Server 2000 databases can't be restored directly on SQL Server 2014 because of the compatibility-level gap, we used SQL Server 2008 as an intermediate server to raise the compatibility level.
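On the intermediate server the step looks roughly like this (names are illustrative; level 100 corresponds to SQL Server 2008):

```sql
-- Restore the SQL Server 2000 backup on the intermediate 2008 instance;
-- the database files are upgraded automatically during the restore.
RESTORE DATABASE [SalesDB]
FROM DISK = N'D:\Backup\SalesDB_1.bak',
     DISK = N'D:\Backup\SalesDB_2.bak',
     DISK = N'D:\Backup\SalesDB_3.bak',
     DISK = N'D:\Backup\SalesDB_4.bak'
WITH REPLACE, STATS = 10;

-- Raise the compatibility level so the database behaves as a 2008
-- database and can later be restored on SQL Server 2014.
ALTER DATABASE [SalesDB] SET COMPATIBILITY_LEVEL = 100;
```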
- Backup
After raising the compatibility level we backed up the databases again and packed them into zip archives split into 700 MB volumes so they could be transferred over the network efficiently.
- Copy
Both environments sat behind a VPN, so we set up an FTP server in the new data center and copied the archived databases over FTP.
- Restore
We restored the databases in the new data center using a custom stored procedure that accepts a path to the backup folder as a parameter and restores the whole set of backups automatically.
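The core of such a procedure can be sketched as follows (the procedure name is hypothetical, and the real version also handles striped backups and WITH MOVE for relocated data and log files):

```sql
-- Simplified sketch: list *.bak files in a folder with xp_dirtree and
-- restore each one with a dynamically built RESTORE statement.
CREATE PROCEDURE dbo.usp_RestoreFromFolder  -- hypothetical name
    @folder NVARCHAR(260)
AS
BEGIN
    DECLARE @files TABLE (subdirectory NVARCHAR(260), depth INT, [file] INT);
    INSERT INTO @files
    EXEC master.sys.xp_dirtree @folder, 1, 1;

    DECLARE @name NVARCHAR(260), @sql NVARCHAR(MAX);
    DECLARE c CURSOR LOCAL FAST_FORWARD FOR
        SELECT subdirectory FROM @files
        WHERE [file] = 1 AND subdirectory LIKE N'%.bak';
    OPEN c;
    FETCH NEXT FROM c INTO @name;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- Database name = file name without the .bak extension
        SET @sql = N'RESTORE DATABASE '
                 + QUOTENAME(LEFT(@name, LEN(@name) - 4))
                 + N' FROM DISK = N''' + @folder + N'\' + @name
                 + N''' WITH REPLACE, STATS = 10;';
        EXEC sys.sp_executesql @sql;
        FETCH NEXT FROM c INTO @name;
    END
    CLOSE c; DEALLOCATE c;
END
```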
Jobs migration
This is the step that required the most manual and analytical work.
According to our analysis, a certain portion of the jobs was legacy and didn't need to be migrated at all.
For the rest we did the following:
- Script
Automatically scripted out all the jobs
- Replace parameters
Replaced logins, file paths, etc. with new values
- Roll out to destination
Created jobs in destination location
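The analysis that separates legacy jobs from live ones starts with an inventory of every job and its step commands; a query along these lines (against msdb) shows where server-specific logins and paths appear:

```sql
-- Inventory of all SQL Agent jobs and their step commands.
-- Used to spot legacy jobs and locate values that must be replaced
-- (file paths, linked-server names, logins) before rollout.
SELECT  j.name    AS job_name,
        j.enabled,
        s.step_id,
        s.step_name,
        s.subsystem,   -- TSQL, CmdExec, SSIS, ...
        s.command
FROM    msdb.dbo.sysjobs     AS j
JOIN    msdb.dbo.sysjobsteps AS s ON s.job_id = j.job_id
ORDER BY j.name, s.step_id;
```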
SSRS migration
Here is our guide to SSRS migrations.
Logins
- Script out
To script out the logins we used a custom stored procedure. It was important to preserve the mapping of logins to databases, and the procedure took care of that.
- Create from script
All Windows logins had to be filtered out, as they are specific to a particular environment
- SIDs must match
The SID in the restored databases and the SID of the recreated login must match; otherwise the users mapped to that login become orphaned and access through it breaks.
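Recreating a SQL login with its original SID and password hash looks roughly like this (the hash and SID below are placeholders; the real values come from the scripting procedure, in the spirit of Microsoft's sp_help_revlogin script):

```sql
-- Recreate a SQL login on the destination server preserving the
-- original SID and password hash, so database users are not orphaned.
CREATE LOGIN [app_user]
WITH PASSWORD = 0x01003F2D... HASHED,  -- placeholder: original password hash
     SID = 0x3DFA8C12...,              -- placeholder: original SID
     CHECK_POLICY = OFF;
```

If some users still end up orphaned after the restore, `EXEC sp_change_users_login 'Report'` lists them.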
COM objects
The trickiest part of the whole migration. Some COM objects used legacy code not supported in SQL Server 2014. When we had the sources, it was straightforward: we changed the code and recompiled. When the sources were missing, we had to recreate the COM's logic as follows:
- Capture all the actions performed by the COM library with SQL Server Profiler
- Disassemble the COM libraries and study the resulting code
- With the findings from those two steps, recreate the logic either in .NET or directly in stored procedures
DTS packages
During this phase we decided to move the DTS packages to the new location as-is and postpone their upgrade to SSIS until the next phase, since so much had already changed. With 6,000 DTS packages, transferring them manually was practically impossible, so we created a set of stored procedures to modify the packages: they rewrote the connection strings and file paths to fit the new environment.
We then copied all the DTS packages from msdb into a separate database and transferred it to the new environment, where we inserted the packages back into msdb. Unfortunately, we had to keep a SQL Server 2005 instance on a separate server for DTS compatibility. These packages are planned to be upgraded to SSIS and moved onto SQL Server 2014 later.
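The msdb-to-msdb transfer can be sketched in two statements (the transfer database name is hypothetical):

```sql
-- On the source server: copy all DTS packages out of msdb into a
-- transfer database that can be backed up and shipped.
SELECT * INTO DtsTransfer.dbo.sysdtspackages
FROM   msdb.dbo.sysdtspackages;

-- On the destination DTS server: insert the packages back into msdb,
-- skipping any package version that is already there.
INSERT INTO msdb.dbo.sysdtspackages
SELECT t.*
FROM   DtsTransfer.dbo.sysdtspackages AS t
WHERE  NOT EXISTS (SELECT 1
                   FROM msdb.dbo.sysdtspackages AS p
                   WHERE p.id = t.id AND p.versionid = t.versionid);
```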
Replication
Most of the replications were not needed in the new environment: the number of servers had decreased dramatically, and in the past those replications existed mainly to balance load between old, weak servers. In the new environment we created all necessary replications from scratch and set up a monitoring system to be notified of any replication issues.
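Setting up a transactional publication from scratch follows the standard replication stored procedures; a minimal sketch (database, publication, article, and subscriber names are illustrative):

```sql
USE [SalesDB];

-- Enable the database for publishing.
EXEC sp_replicationdboption @dbname = N'SalesDB',
                            @optname = N'publish',
                            @value = N'true';

-- Create a transactional publication.
EXEC sp_addpublication @publication = N'SalesDB_Pub',
                       @status = N'active',
                       @repl_freq = N'continuous';

-- Add a table as an article.
EXEC sp_addarticle @publication = N'SalesDB_Pub',
                   @article = N'Orders',
                   @source_object = N'Orders',
                   @source_owner = N'dbo';

-- Push the publication to the read-only cluster.
EXEC sp_addsubscription @publication = N'SalesDB_Pub',
                        @subscriber = N'READONLY-SQL',   -- placeholder
                        @destination_db = N'SalesDB',
                        @subscription_type = N'push';
```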
Setup maintenance and monitoring
In the new environment we set up all our standard monitoring and maintenance routines to keep the servers and databases in good shape.
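A typical piece of such a routine is a nightly integrity check plus fragmentation-driven index maintenance; a sketch (the 30% threshold and database name are illustrative):

```sql
-- Nightly integrity check.
DBCC CHECKDB (N'SalesDB') WITH NO_INFOMSGS;

-- List indexes fragmented above 30%; the maintenance job rebuilds
-- or reorganizes these.
SELECT  OBJECT_NAME(ips.object_id)       AS table_name,
        i.name                           AS index_name,
        ips.avg_fragmentation_in_percent
FROM    sys.dm_db_index_physical_stats(DB_ID(N'SalesDB'),
                                       NULL, NULL, NULL, 'LIMITED') AS ips
JOIN    sys.indexes AS i
  ON    i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE   ips.avg_fragmentation_in_percent > 30
  AND   i.name IS NOT NULL;
```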
Results
- Overall system performance improved more than tenfold. End users now work with the system in real time instead of waiting for calculation results as in the old environment, which ultimately translates into higher profits for the business.
- We automated almost every step of the migration, which shortened the migration and made the process more reliable.
- A few big steps remain, such as refactoring the legacy code and upgrading DTS to SSIS.
As mentioned at the beginning of this article, we plan to post a separate article about each migration step with full technical details. Please let us know which topics we should cover first.