Over time, I have worked on many databases that have used an integer IDENTITY as a primary key or clustered index value. Most of the time there haven’t been any issues, but occasionally I come across a problem table where the IDENTITY value ran out and insert statements then failed. This has typically been a logging table of some type.

When creating a table, you can specify IDENTITY with the following syntax:

IDENTITY [ (seed , increment) ]


For example:

CREATE TABLE [dbo].[TestIntIdentity](
  [testID] [int] IDENTITY(1,1) NOT NULL,
  [logTime] [datetime] NOT NULL DEFAULT GetDate()
);


Usually, there are a few other columns in the table, but to keep it simple for this example, we will just have 2 columns.

The IDENTITY property is valid for the integer data types (tinyint, smallint, int, and bigint), as well as decimal and numeric columns with a scale of 0; it cannot be used with the bit data type. So how many rows can you insert into that table? It depends on the data type used.

BIGINT maximum value is 9,223,372,036,854,775,807
INT maximum value is 2,147,483,647
SMALLINT maximum value is 32,767
TINYINT maximum value is 255

What happens when I run out of numbers?  Let’s look at an example using SMALLINT which should have a maximum value of 32,767.

-- SMALLINT should accommodate 32767 values when starting at 1
CREATE TABLE [dbo].[TestSmallIntIdentity](
  [testID] [smallint] IDENTITY(1,1) NOT NULL,
  [logTime] [datetime] NOT NULL DEFAULT GetDate(),
  CONSTRAINT [PK_TestSmallIntIdentity] PRIMARY KEY CLUSTERED
  (
       [testID] ASC
  )
);
GO

INSERT INTO TestSmallIntIdentity DEFAULT VALUES;

SELECT * FROM TestSmallIntIdentity;
SELECT count(*) FROM TestSmallIntIdentity;
GO

 


At this point, we have 1 row in the table. Now let’s fill it almost to capacity by inserting 32,765 more rows.

INSERT INTO TestSmallIntIdentity DEFAULT VALUES;
GO 32765 -- Run it 32,765 times

SELECT * FROM TestSmallIntIdentity;
SELECT count(*) FROM TestSmallIntIdentity;

 

Keep in mind that specifying a number after the GO statement indicates that we want the preceding batch to run that many times. In this case, running the insert statement 32,765 times, plus the earlier insert, should leave us with 32,766 rows in the table.

Let’s take a quick look at this table’s identity column usage by using the following query:

--Identity Column Usage
SELECT CONVERT(Numeric(18,2),
         (((CONVERT(Float, last_value) - CAST(a.seed_value as float)) /
           CONVERT(Float,
             CASE
               WHEN b.name = 'tinyint'  THEN 255 - CAST(a.seed_value as float)
               WHEN b.name = 'smallint' THEN 32767 - CAST(a.seed_value as float)
               WHEN b.name = 'int'      THEN 2147483647 - CAST(a.seed_value as float)
               WHEN b.name = 'bigint'   THEN 9223372036854775807 - CAST(a.seed_value as float)
             END)) * 100.0)) AS "Percent"
      ,s.name as SchemaName
      ,t.name as TableName
      ,a.name as IdentityColumn
      ,CONVERT(BIGINT, ISNULL(a.last_value, 0)) AS last_value
FROM sys.identity_columns a
INNER JOIN sys.types b ON a.system_type_id = b.system_type_id
LEFT JOIN sys.tables t ON t.object_id = a.object_id
LEFT JOIN sys.schemas s ON s.schema_id = t.schema_id
WHERE s.name IS NOT NULL
  AND t.name = 'TestSmallIntIdentity';


The results show that we are at 100% utilization of that identity column.


Now for the big question: there are 32,766 rows in the table, and we know it will hold values up to 32,767, so what happens when we insert another 1 or 2 rows?

 

INSERT INTO TestSmallIntIdentity DEFAULT VALUES;

SELECT count(*) FROM TestSmallIntIdentity;

 

This first one works fine and puts our count at 32,767, so let’s insert one more row…

INSERT INTO TestSmallIntIdentity DEFAULT VALUES;

 

This message is SQL Server’s way of saying that the identity column is full.

Msg 8115, Level 16, State 1, Line 1
Arithmetic overflow error converting IDENTITY to data type smallint.
Arithmetic overflow occurred.

 

How do I fix it?

This is where it gets ugly. You can’t simply change the data type of an IDENTITY column on an existing table when that column is a primary key, and it gets even harder when foreign keys reference that column.

It would be nice to just change the primary key like this:

ALTER TABLE TestSmallIntIdentity ALTER COLUMN TestID INTEGER;

But you get the following error:

Msg 5074, Level 16, State 1, Line 52
The object 'PK_TestSmallIntIdentity' is dependent on column 'TestID'.
Msg 4922, Level 16, State 9, Line 52
ALTER TABLE ALTER COLUMN TestID failed because one or more objects access this column.

 

First, the easy case: if no other tables have foreign key relationships to this column and it is just the primary key, you can do the following:

1. Drop the primary key constraint.
2. Alter the table and set the column to a larger size. For instance, TINYINT to SMALLINT or INTEGER to BIGINT.
3. Add the primary key constraint back.

It would look something like this:

ALTER TABLE [dbo].[TestSmallIntIdentity] DROP CONSTRAINT [PK_TestSmallIntIdentity];

ALTER TABLE [dbo].[TestSmallIntIdentity] ALTER COLUMN TestID INTEGER;

ALTER TABLE [dbo].[TestSmallIntIdentity] ADD CONSTRAINT [PK_TestSmallIntIdentity] PRIMARY KEY CLUSTERED
(
       [testID] ASC
);
GO

 

-- verify additional rows can be added
INSERT INTO TestSmallIntIdentity DEFAULT VALUES;
SELECT * FROM TestSmallIntIdentity;
SELECT COUNT(*) FROM TestSmallIntIdentity;

 

The more complex case is when there are foreign key relationships to the table. You need to drop the foreign keys that reference the original table, resize the referencing columns in the other tables, resize the identity column itself, then re-create all the foreign keys. Note that simply disabling the foreign keys is not enough; they must be dropped before the referenced primary key can be changed. Here are the steps.

1. Drop all the foreign keys that reference the original table.
2. Drop the primary key constraint on the original table.
3. Alter the columns in the other tables that referenced this column to a larger size.
4. Alter the identity column in this table to a larger size.
5. Add the primary key constraint back.
6. Re-create all the foreign keys.
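As a sketch of those steps, assuming a hypothetical child table [dbo].[ChildTable] with a column [parentID] and a foreign key [FK_ChildTable_Parent] referencing [testID] (the child table and constraint names here are made up for illustration):

```sql
-- 1. Drop the foreign key so the referenced column can be altered
--    (disabling it with NOCHECK is not enough)
ALTER TABLE [dbo].[ChildTable] DROP CONSTRAINT [FK_ChildTable_Parent];

-- 2. Drop the primary key constraint on the original table
ALTER TABLE [dbo].[TestSmallIntIdentity] DROP CONSTRAINT [PK_TestSmallIntIdentity];

-- 3. and 4. Resize the referencing column and the identity column to match
ALTER TABLE [dbo].[ChildTable] ALTER COLUMN [parentID] INTEGER;
ALTER TABLE [dbo].[TestSmallIntIdentity] ALTER COLUMN [testID] INTEGER;

-- 5. Add the primary key constraint back
ALTER TABLE [dbo].[TestSmallIntIdentity]
  ADD CONSTRAINT [PK_TestSmallIntIdentity] PRIMARY KEY CLUSTERED ([testID] ASC);

-- 6. Re-create the foreign key, checking existing data as it is added
ALTER TABLE [dbo].[ChildTable] WITH CHECK
  ADD CONSTRAINT [FK_ChildTable_Parent]
  FOREIGN KEY ([parentID]) REFERENCES [dbo].[TestSmallIntIdentity] ([testID]);
```

With more than one referencing table, repeat the drop, resize, and re-create for each foreign key.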

Oh, and did I mention that you should do this at a time when no one else is inserting anything new into the original table? It may make sense to put the database in single user mode to guarantee that data isn’t being modified while you are making these changes.
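If you do go the single user route, a minimal sketch looks like this (the database name is a placeholder; ROLLBACK IMMEDIATE disconnects other sessions, rolling back their open transactions):

```sql
-- Take exclusive access before making the schema changes
ALTER DATABASE [YourDatabase] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

-- ... perform the constraint and column changes here ...

-- Return the database to normal use
ALTER DATABASE [YourDatabase] SET MULTI_USER;
```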

How to prevent running out of values?

There are two options when you originally create a table.

1. Use a larger integer data type.
2. Seed the identity at the smallest value available for that data type.

On SQL Server 2012 and later, you could also consider a SEQUENCE object instead of an IDENTITY.
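As a sketch (the sequence and table names here are made up for illustration), a SEQUENCE can feed a column through a default the way IDENTITY does, and it can also be created with its own MINVALUE or CYCLE options:

```sql
-- A SEQUENCE-based alternative to IDENTITY (SQL Server 2012+)
CREATE SEQUENCE dbo.TestSequence
    AS int
    START WITH 1
    INCREMENT BY 1;

CREATE TABLE dbo.TestSequenceTable (
    testID  int      NOT NULL DEFAULT (NEXT VALUE FOR dbo.TestSequence),
    logTime datetime NOT NULL DEFAULT GETDATE()
);

-- The default pulls the next sequence value for each row
INSERT INTO dbo.TestSequenceTable DEFAULT VALUES;
```

Unlike IDENTITY, a sequence is a standalone object, so the same sequence can number rows across multiple tables if you want it to.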

The first option, a larger integer data type, is more expensive for the database. For instance, if you simply set every identity column to BIGINT, you would add a large amount of storage overhead and waste on many of your tables.

The second option is seeding the identity to the smallest value available for the sequence.

BIGINT minimum value is -9,223,372,036,854,775,808
INT minimum value is -2,147,483,648
SMALLINT minimum value is -32,768
TINYINT minimum value is 0

There really isn’t anything to be gained with TINYINT, since its minimum value is already 0, but with the three other integer data types you double the capacity of your identity values by starting them at their minimum value instead of at 1.
Take a look at the following example, seeding the value with -32,768 instead of 1.

-- SMALLINT should accommodate -32768 to 32767 values
CREATE TABLE [dbo].[TestSmallIntIdentity2](
  [testID] [smallint] IDENTITY(-32768,1) NOT NULL,
  [logTime] [datetime] NOT NULL DEFAULT GetDate()
) ON [PRIMARY];
GO

INSERT INTO TestSmallIntIdentity2 DEFAULT VALUES;

SELECT * FROM TestSmallIntIdentity2;
SELECT count(*) FROM TestSmallIntIdentity2;

 

After inserting 1 row, you can see that the first value inserted has an identity value of -32,768. Now what happens when we insert 32,768 more values?

INSERT INTO TestSmallIntIdentity2 DEFAULT VALUES;
GO 32768 -- Run it 32,768 times

-- take a look at the last rows to be inserted
SELECT TOP 10 *
FROM TestSmallIntIdentity2
ORDER BY testID DESC;

 

From the output you can see that, after inserting 32,769 rows in total, we are now at an IDENTITY value of 0, rather than overflowing.

Let’s look at the same table using the identity column usage query from above; be sure to change the table name to TestSmallIntIdentity2. The results show that we are only 50% full on our identity value, rather than 100% full, after inserting 32,769 rows.
 




Summary

Originally, I wondered why you would ever start an identity at anything other than its minimum value, to take full advantage of the capacity of the data type you are using. Over time, I have experienced resistance from some developers and DBAs who are not comfortable with negative values in their identities. This usually comes down to confusion among system users, analysts, or others who get thrown off by the large negative numbers.

If you are not comfortable with negative values in the identity column, if it confuses business users, or for some reason doesn’t work in your environment, then I recommend sizing your identity values larger so that you don’t run out of space.

If you are okay with negative values, then I recommend starting your identities at the most negative number possible for the data type to not run out of values over time.

Either way, you can use the query above to check in on your overall identity column usage so that you will know before you run out.