
SQL Server


Arrays and Lists in SQL Server 2008. An SQL text by Erland Sommarskog, SQL Server MVP. Latest revision: 2012-07-01. Introduction. In the public forums for SQL Server, you often see people asking How do I use arrays in SQL Server? or Why does SELECT * FROM tbl WHERE col IN (@list) not work? The short answer to the first question is that SQL Server does not have arrays – SQL Server has tables. Up to SQL Server 2005, there was no way to pass a table from a client; you had to pass a comma-separated string or similar to SQL Server and then unpack that list into a table in your stored procedure.
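SQL 2008 lifted that restriction with table-valued parameters. As a minimal sketch, and not the article's own example (the type name, procedure name, and OrderID values are all illustrative, assuming the Northwind Orders table):

```sql
-- Hypothetical names, for illustration only; requires SQL 2008 or later.
-- A table type is defined once; procedures then take it as a read-only
-- parameter, and clients can pass a whole table in a single call.
CREATE TYPE integer_list_tbltype AS TABLE (n int NOT NULL PRIMARY KEY)
GO
CREATE PROCEDURE get_orders @ids integer_list_tbltype READONLY AS
   SELECT o.* FROM Orders o JOIN @ids i ON o.OrderID = i.n
GO
-- Calling it from T-SQL; from C# you would pass a DataTable (or an
-- IEnumerable<SqlDataRecord>) as a parameter of SqlDbType.Structured.
DECLARE @ids integer_list_tbltype
INSERT @ids (n) VALUES (10248), (10249), (10250)
EXEC get_orders @ids
```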

This changed with SQL 2008. The examples are in C# and VB .NET, using the SqlClient API, and the main body of the article covers this environment. There is an accompanying article, Arrays and Lists in SQL Server 2005 and Beyond (and an even older one for SQL 2000), where I describe in detail various methods to pass a list of values in a string and unpack them into a table in SQL Server.

Arrays and Lists in SQL Server 2005. An SQL text by Erland Sommarskog, SQL Server MVP. Latest revision: 2013-11-12. Introduction. In the public forums for SQL Server, you often see people asking How do I use arrays in SQL Server?

Or Why does SELECT * FROM tbl WHERE col IN (@list) not work? The short answer to the first question is that SQL Server does not have arrays – SQL Server has tables. However, up to SQL 2005 you could not pass a table as input to SQL Server from a client; you had to pass a string with the values and unpack it into a table on the SQL Server end. This article describes a number of different ways to do this, both good and bad.
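One of the classic methods is an iterative split function. Here is a minimal sketch, not taken from the article (the function name and the join against Northwind's Orders table are illustrative; the function assumes a well-formed list of integers):

```sql
-- Illustrative only: unpack a comma-separated list of integers into a
-- table, using charindex/substring so it also works on SQL 2005.
CREATE FUNCTION dbo.list_to_table (@list nvarchar(MAX))
RETURNS @tbl TABLE (value int NOT NULL) AS
BEGIN
   DECLARE @pos int, @next int
   SET @pos = 1
   WHILE @pos <= len(@list)
   BEGIN
      SET @next = charindex(',', @list, @pos)
      IF @next = 0 SET @next = len(@list) + 1   -- last element
      INSERT @tbl (value)
         VALUES (convert(int, substring(@list, @pos, @next - @pos)))
      SET @pos = @next + 1
   END
   RETURN
END
GO
-- Join against the unpacked list instead of writing IN (@list):
DECLARE @list nvarchar(MAX)
SET @list = '10248,10249,10250'
SELECT o.*
FROM   Orders o
JOIN   dbo.list_to_table(@list) AS l ON o.OrderID = l.value
```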

SQL 2008 added a new feature that does away with this kind of kludge: table-valued parameters, TVPs. Furthermore, there are two performance-test appendixes. If you feel deterred by the sheer length of this article, you should be relieved to know that this is the kind of article where you may come and go as you please. Note: all samples in this article refer to the Northwind database.

SQL Server Batch Update. We should always take care of database server performance, but sometimes you need to perform a huge update on your table or tables, which will affect performance for concurrent users, and they will complain about it. For example, if you have a table that contains more than a billion rows, you want to update a column of that data, and your database must be accessible 24 hours a day, 7 days a week, it is difficult to run this as a single UPDATE statement: a lot of data will be fetched into memory, and a lot of other users' data will be forced out of memory.
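One common remedy is to walk the table in key order, updating a fixed number of rows per statement and pausing between batches. A minimal sketch, with assumed names (a Products table with ProductID and Price columns) and an arbitrary batch size and delay:

```sql
-- Update 1000 rows at a time in key order, pausing 2 seconds between
-- batches so other sessions are not starved. Table/column names and
-- the 8% price increase are illustrative.
DECLARE @lastid int, @rows int
DECLARE @done TABLE (id int NOT NULL)
SET @lastid = 0
SET @rows = 1

WHILE @rows > 0
BEGIN
   ;WITH batch AS (
      SELECT TOP (1000) ProductID, Price
      FROM   Products
      WHERE  ProductID > @lastid
      ORDER  BY ProductID
   )
   UPDATE batch
   SET    Price = Price * 1.08
   OUTPUT inserted.ProductID INTO @done (id)

   SET @rows = @@ROWCOUNT
   SELECT @lastid = MAX(id) FROM @done   -- remember where we stopped
   DELETE @done

   WAITFOR DELAY '00:00:02'              -- 2-second pause between batches
END
```

Committing each batch separately also keeps the transaction log from growing as it would under one giant transaction.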

So the solution is to use a batch update: divide your data into small batches, e.g. update 1,000 rows at a time, pause for a moment, then update the next 1,000, and so on. This will let other users share the server with you. Sample code (fragment):

-- SQL update in batches of 1000
SET Price = Price * 1.08
-- 2 second delay

SQL Server - Unit and Integration Testing of SSIS Packages. I worked on a project where we built extract, transform and load (ETL) processes with more than 150 packages.

Many of them contained complex transformations and business logic, thus were not simple “move data from point A to point B” packages. Making minor changes was not straightforward and results were often unpredictable. To test packages, we used to fill input tables or files with test data, execute the package or task in Microsoft Business Intelligence Development Studio (BIDS), write a SQL query and compare the output produced by the package with what we thought was the correct output.

More often we just ran the whole ETL process on a sample database and sampled the output data at the end of the process, a time-consuming and unreliable procedure. This article explains how to perform unit and integration testing of SSIS packages by introducing a library called SSISTester, which is built on top of the managed SSIS API. SSISTester mini-ETL sample data:

id,name
1,company1
5,company2
11,company3

CHECKDB From Every Angle: Complete description of all CHECKDB stages.

On the Storage Engine blog last year I started two series that I got distracted from – one on CHECKDB and one on fragmentation. With the benefit of hindsight, I’m going to start 3 series on my new blog here – one on CHECKDB (‘CHECKDB from every angle’), one on indexes (‘Indexes from every angle’), and one on internals (‘Inside the Storage Engine’).

The first few posts of each will be updated reposts of a few from the previous blog, just for completeness. I realize that I promised to finalize the DBCC whitepaper this summer – well, that's been delayed a little by the wedding and other stuff. Once I've done a few blog posts in this series, I'll tie everything up into a PDF and add a link to it. First up is describing just what CHECKDB does. 1. CHECKDB needs a consistent view of the database. The easy way to get the consistent state is through locking, which is what SQL Server 7.0 did. There are some problems with this mechanism, however.

BULK INSERT (Transact-SQL). Applies to: SQL Server (all supported versions), Azure SQL Database. Imports a data file into a database table or view in a user-specified format in SQL Server. Transact-SQL syntax conventions apply. Arguments: database_name is the database name in which the specified table or view resides.

schema_name specifies the name of the table or view schema; it is optional if the default schema for the user performing the bulk-import operation is the schema of the specified table or view. table_name specifies the name of the table or view to bulk import data into. FROM 'data_file' specifies the full path of the data file that contains the data to import into the specified table or view; data_file must specify a valid path from the server on which SQL Server is running. For example: BULK INSERT Sales.Orders FROM '\\SystemX\DiskZ\Sales\data\orders.dat'; Beginning with SQL Server 2017 (14.x), the data_file can be in Azure Blob Storage.
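Putting these arguments together with a few common options, a sketch (the terminators, FIRSTROW, and BATCHSIZE values are illustrative, not required settings):

```sql
-- Load a comma-delimited file, skipping a header line and committing
-- in batches of 10000 rows; each batch is one transaction.
BULK INSERT Sales.Orders
FROM '\\SystemX\DiskZ\Sales\data\orders.dat'
WITH (
   FIELDTERMINATOR = ',',
   ROWTERMINATOR   = '\n',
   FIRSTROW        = 2,       -- skip the header line
   BATCHSIZE       = 10000    -- rows per transaction
);
```

Note that FIRSTROW counts rows by the row terminator, so if the header line ends differently from the data lines, BULK INSERT can appear to skip one row too many, which is the gotcha behind the FIRSTROW questions referenced below.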

Azure SQL Database only supports reading from Azure Blob Storage. BATCHSIZE = batch_size specifies the number of rows in a batch; each batch is copied to the server as one transaction.

Related reading: Ignoring the last row of a flat file when doing BULK INSERT (SQL Server 2005); using BULK INSERT from a flat file; SQL Bulk Insert with FIRSTROW parameter skips the following line. Getting Started with SQL Server Database Unit Testing in SSDT - SQL Server Data Tools Team Blog.

Unit testing SQL Server databases with SSDT is very straightforward, although there are a couple of things to look out for. First you must have Visual Studio 2010 or 2012 Professional edition or higher installed. If you have Visual Studio 2010 Premium or Ultimate edition installed then the new SQL Server database unit testing installs alongside the existing database unit testing from Visual Studio. You can continue to use the old tools but you cannot mix database unit tests created by both versions in the same test project - more on this below.

Test Projects and converting existing Test Projects: SQL Server unit tests are created in a normal VB or C# test project. SSDT introduces a new SQL Server Unit Test template and type. Generating SQL Server unit tests from objects in SQL Server Object Explorer: if you have a SQL Server database project in your solution, you can generate unit tests from any stored procedure, function or DML trigger defined in that project. Configuring the Test Project.

SQL Server Profiler – a tutorial | TheFirstSQL. During all kinds of database interaction there are times when you want to see what is actually being sent to the database, the actual queries that are executed by the database engine. Sometimes you have third-party apps that do all kinds of different stuff under the hood, or maybe you have inherited a website from some developer who made a serious spaghetti of everything.

The only way of knowing what is really going on in the database is to use SQL Server Profiler. Now, although Profiler is a simple yet *very* powerful tool, it does take some tweaking to make it work the way you want it to. It comes with some default templates, but in my experience they are not nearly good enough to be useful in a working environment. There are a whole bunch of objects/events that can be monitored, but what I'm gonna do is show you how to make a trace template that will provide the information you need most of the time. Here is the order of things you need to do: first of all, you need to open up Profiler.
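As an aside, the same data Profiler displays can also be captured without the GUI via a server-side trace. A minimal sketch (the file path is illustrative; the event and column IDs below are from the sp_trace_setevent documentation: events 10 and 12 are RPC:Completed and SQL:BatchCompleted, columns 1, 12 and 13 are TextData, SPID and Duration):

```sql
-- Create a trace writing to a file (.trc is appended automatically).
DECLARE @TraceID int, @maxsize bigint, @on bit
SET @maxsize = 50   -- max file size in MB
SET @on = 1
EXEC sp_trace_create @TraceID OUTPUT, 0, N'C:\traces\mytrace', @maxsize, NULL

-- Capture TextData, SPID and Duration for completed RPCs and batches.
EXEC sp_trace_setevent @TraceID, 10, 1,  @on
EXEC sp_trace_setevent @TraceID, 10, 12, @on
EXEC sp_trace_setevent @TraceID, 10, 13, @on
EXEC sp_trace_setevent @TraceID, 12, 1,  @on
EXEC sp_trace_setevent @TraceID, 12, 12, @on
EXEC sp_trace_setevent @TraceID, 12, 13, @on

-- Start the trace; later, stop it with status 0 and close it with status 2.
EXEC sp_trace_setstatus @TraceID, 1
```

A server-side trace has much less overhead than running the Profiler GUI against a busy production server.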