Data quality in SQL Server // TRY_CONVERT for bigint, int, smallint and tinyint done safely

Data quality starts with type conversion — and for integer columns, TRY_CONVERT has a few quirks that tend to slip past the import path: an empty string becomes 0 (not NULL), any decimal or thousands separator yields NULL, and a typed decimal is silently truncated — without rounding. This article sorts out the rules, shows the safe conversion pattern, and adds a Postgres bridge.

At a glance:

  • The four integer types cover clearly tiered value ranges — from 1-byte tinyint (0..255, unsigned) up to 8-byte bigint.
  • When converting from text, TRY_CONVERT accepts only integer representations — a sign and whitespace are allowed, while decimal or thousands separators yield NULL, regardless of locale.
  • The empty string is the edge case: it returns 0, which is usually wrong semantically — a CASE/TRIM pattern cleanly maps it to NULL instead.
  • In Postgres, CAST(... AS bigint) throws an exception instead of returning NULL; a dedicated PL/pgSQL wrapper try_cast_bigint closes that gap.

Prerequisite: SQL Server 2017+ (TRY_CONVERT has shipped since 2012; the safe pattern below uses TRIM, which requires SQL Server 2017 — on 2012–2016, use LTRIM(RTRIM(...)) instead). Examples run without a sample dataset — pure inline literals.
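
For the 2012–2016 fallback just mentioned, the CASE pattern from the "Safe type conversion" section reads as follows with LTRIM/RTRIM instead of TRIM (a sketch; @p_input stands in for the import column):

  1: DECLARE @p_input AS nvarchar(30) = N' ';
  2: 
  3: SELECT TRY_CONVERT(int,
  4:                    CASE WHEN LTRIM(RTRIM(@p_input)) = N'' THEN NULL ELSE @p_input END
  5:                   ) AS [Output];  -- NULL instead of 0

LTRIM(RTRIM(...)) only strips plain spaces, which is exactly what the empty-string check needs here.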

Content

Microsoft Learn for TRY_CONVERT (and for CAST and CONVERT) states that the value to be converted can be any expression. That means both nvarchar strings from CSV/JSON/XML imports and typed numbers (e.g. decimal(18,5) from a pipeline calculation) are valid inputs. This article distinguishes between the two cases — Text and Non-Text — and presents a safe conversion pattern for each.

Value range of the integer types

Type choice comes before conversion. The four integer types in SQL Server differ only by value range and storage size — the conversion rules are identical.

  Data type | Min                        | Max                       | Bytes | Typical use
  bigint    | -9 223 372 036 854 775 808 | 9 223 372 036 854 775 807 | 8     | Huge surrogate keys, counters above billions, snowflake IDs
  int       | -2 147 483 648             | 2 147 483 647             | 4     | Standard surrogate key, quantities, counters in the 9-digit range
  smallint  | -32 768                    | 32 767                    | 2     | Years, small quantities, legacy lookup keys
  tinyint   | 0                          | 255                       | 1     | Flags, status codes, small lookup values — unsigned!

tinyint is a Microsoft quirk: the only SQL Server integer type without a sign. A negative value cannot be converted to tinyint; TRY_CONVERT returns NULL. Postgres has no tinyint at all — the smallest integer there is smallint (signed, -32 768..32 767).

Text

When a text needs to be converted to an integer value, the following scenarios are worth examining:

  • The text contains a valid number
  • The number uses a comma as the decimal separator
  • The text contains thousands separators, possibly with a decimal separator
  • The text is an empty string or contains only spaces

The following TRY_CONVERT calls cover these cases. The returned result is noted after each statement:

  1: SELECT TRY_CONVERT(int, NULL    ) -- NULL
  2: SELECT TRY_CONVERT(int, N'123'  ) -- 123
  3: SELECT TRY_CONVERT(int, N'123,4') -- NULL
  4: SELECT TRY_CONVERT(int, N'1,234') -- NULL
  5: SELECT TRY_CONVERT(int, N'123.4') -- NULL
  6: SELECT TRY_CONVERT(int, N'1.234') -- NULL
  7: SELECT TRY_CONVERT(int, N''     ) -- 0
  8: SELECT TRY_CONVERT(int, N' '    ) -- 0
  9: SELECT TRY_CONVERT(int, N' 123' ) -- 123
 10: SELECT TRY_CONVERT(int, N'123 ' ) -- 123

Numbers that are clearly integers convert as expected. Leading and trailing spaces do not affect the result. Decimal numbers with a separator — whether comma (German notation) or period (American notation) — cannot be converted to int and return NULL. TRY_CONVERT is locale-independent: neither SET LANGUAGE nor SET DATEFORMAT influences how a decimal or thousands separator is interpreted.

The result already holds a surprise. Empty strings or strings containing only spaces convert to 0. While the NULL result for decimal separators is reasonable, the 0 conversion of an empty string is usually wrong semantically in an ETL context: an empty CSV field is “unknown”, not “0”. The safe variant follows further down as a pattern.

Non-Text

When the input is not text but already typed (an integer with a different value range, or a decimal), the scenarios reduce to:

  • The value passed is an integer (possibly outside the target range).
  • The value passed is a decimal number.

The first three tests:

  1: SELECT TRY_CONVERT(int, 2147483648) -- NULL
  2: SELECT TRY_CONVERT(int,        123) -- 123
  3: SELECT TRY_CONVERT(int,     1234.5) -- 1234

The first number is 1 greater than int MAX (2 147 483 647). The call returns NULL as expected — typical overflow protection. Decimal numbers can be converted, but SQL Server does not round: TRY_CONVERT(int, 1234.5) returns 1234, not 1235. If you want rounding, apply it explicitly before the conversion — ROUND(1234.5, 0) returns 1235.0, which then converts to 1235.
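
The truncation/rounding contrast as a compact pair, using the same literal as in the test block above:

  1: SELECT TRY_CONVERT(int, 1234.5)            -- 1234 (truncated)
  2: SELECT TRY_CONVERT(int, ROUND(1234.5, 0))  -- 1235 (rounded first, then converted)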

The range boundaries per type are covered by a second block:

  1: SELECT TRY_CONVERT(smallint,  32767); -- 32767  (smallint-MAX)
  2: SELECT TRY_CONVERT(smallint,  32768); -- NULL   (Overflow)
  3: SELECT TRY_CONVERT(tinyint,     255); -- 255    (tinyint-MAX, unsigned)
  4: SELECT TRY_CONVERT(tinyint,     256); -- NULL   (Overflow)
  5: SELECT TRY_CONVERT(tinyint,      -1); -- NULL   (tinyint is unsigned)

Safe type conversion

The two blocks above expose two edge cases that must be handled explicitly in the import path:

  • An empty string converts to 0. If that is semantically wrong (the default case for CSV imports), the empty string must be mapped to NULL before the TRY_CONVERT.
  • A typed decimal number is truncated to the integer part, not rounded. If rounding is required, slip a ROUND(..., 0) in before the TRY_CONVERT — or set the target type to decimal(p, 0) instead of int, where SQL Server silently rounds (see FAQ).

Usage example with DECLARE and the snake_case variable convention:

  1: DECLARE @p_input AS nvarchar(30);
  2: SET @p_input = N'123';
  3: 
  4: SELECT TRY_CONVERT(int,
  5:                    CASE WHEN TRIM(@p_input) = '' THEN NULL ELSE @p_input END
  6:                   ) AS [Output];

If you need the pattern across many columns in an ETL process, abstract it into a user-defined function fn_try_convert_int(@p_input nvarchar) — see Design Pattern // Safe Type Conversion with T-SQL (DE only) for a deeper treatment.
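
A minimal sketch of such a wrapper, assuming a plain scalar UDF shape (the linked article may structure it differently):

  1: CREATE OR ALTER FUNCTION dbo.fn_try_convert_int (@p_input AS nvarchar(30))
  2:    RETURNS int
  3: AS
  4: BEGIN
  5:    -- Map empty/blank strings to NULL first, then attempt the conversion
  6:    RETURN TRY_CONVERT(int,
  7:                       CASE WHEN TRIM(@p_input) = N'' THEN NULL ELSE @p_input END
  8:                      );
  9: END;

With the function in place, SELECT dbo.fn_try_convert_int(N' ') returns NULL instead of the raw TRY_CONVERT result of 0.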

Postgres bridge

In multi-engine ETL pipelines the question of a Postgres counterpart comes up. Two differences from the SQL Server stack are decisive:

  • CAST(s AS bigint) is Postgres’ default conversion — and it throws an exception on invalid input. There is no direct TRY_CONVERT counterpart; the call either succeeds or aborts.
  • Even Postgres 18 (released 2025-09-25) does not ship a built-in counterpart — neither a try_cast function nor the SQL/JSON CAST ... ON ERROR NULL syntax from SQL:2023 made it into 18.0. You still write the NULL-instead-of-exception wrapper yourself as a PL/pgSQL function.

Wrapper function as PL/pgSQL counterpart to TRY_CONVERT(int, …):

  1: CREATE OR REPLACE FUNCTION try_cast_bigint (p_input text)
  2:    RETURNS bigint
  3:    LANGUAGE plpgsql
  4:    IMMUTABLE
  5: AS $$
  6: BEGIN
  7:    IF p_input IS NULL OR TRIM(p_input) = '' THEN
  8:       RETURN NULL;
  9:    END IF;
 10:    RETURN p_input::bigint;
 11: EXCEPTION
 12:    WHEN invalid_text_representation OR numeric_value_out_of_range
 13:    THEN
 14:       RETURN NULL;
 15: END;
 16: $$;

With this, try_cast_bigint('1,234') behaves like TRY_CONVERT(int, '1,234'): the call returns NULL instead of raising. Analogous wrappers exist for int and smallint — Postgres has no tinyint type; the smallest integer type there is smallint (signed, -32 768..32 767).

When the input contains formatted numbers with thousands or decimal separators (German or US notation), to_number with a format pattern is the right tool:

  1: SELECT to_number('1.234,56', 'FM999G999D99');  -- 1234.56  (German notation, lc_numeric = 'de_DE.UTF-8')
  2: SELECT to_number('1,234.56', 'FM999G999D99');  -- 1234.56  (US notation, lc_numeric = 'en_US.UTF-8')

to_number is locale-dependent via the session variable lc_numeric — unlike SQL Server’s TRY_CONVERT. In an ETL context it is worth setting lc_numeric explicitly per session, so the import is decoupled from the server’s default configuration.
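
Setting the locale explicitly per session, as suggested, looks like this (assuming the de_DE.UTF-8 locale is installed on the Postgres host):

  1: SET lc_numeric = 'de_DE.UTF-8';                -- pin the session locale for the import
  2: SELECT to_number('1.234,56', 'FM999G999D99');  -- 1234.56

SET without LOCAL scopes the change to the session; SET LOCAL restricts it to the current transaction, which is often the better fit inside an ETL batch.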

  SQL Server TRY_CONVERT              | Postgres counterpart           | Note
  TRY_CONVERT(bigint, s)              | try_cast_bigint(s) (wrapper)   | NULL instead of exception
  TRY_CONVERT(int, s)                 | try_cast_int(s) (wrapper)      | analogous
  TRY_CONVERT(smallint, s)            | try_cast_smallint(s) (wrapper) | analogous
  TRY_CONVERT(tinyint, s)             | (none)                         | Postgres has no tinyint type
  TRY_CONVERT(int, s) with separators | to_number(s, 'FM999G999')      | locale-dependent via lc_numeric

Summary

TRY_CONVERT is the default for integer imports — as long as the edge cases are understood. An empty string becomes 0 (usually wrong semantically), decimal numbers are truncated (not rounded), decimal separators yield NULL (locale-independent). The safe pattern maps the empty string to NULL via CASE/TRIM; rounding is applied as ROUND(..., 0) before the TRY_CONVERT.

Take-Away:

  • Type choice by expected value range: tinyint (0..255) for flags, int (~2.1 B) for the 95% standard case, bigint for anything that does not fit int.
  • TRY_CONVERT as the default for CSV/JSON/XML imports — no runtime error, just a silent NULL fallback.
  • Empty string is usually NULL semantics in an import context, not 0 semantics — apply the safe pattern from “Safe type conversion”.
  • Postgres bridge via a PL/pgSQL wrapper try_cast_bigint (analogous for int and smallint); to_number for formatted numbers with lc_numeric awareness.

FAQ

Why does TRY_CONVERT(int, '1,234') return NULL?

TRY_CONVERT is locale-independent and recognizes neither decimal nor thousands separators. The call expects a pure digit string (optionally with a sign; leading and trailing whitespace are tolerated). The moment a comma or period appears in the string, the conversion fails — regardless of SET LANGUAGE or SET DATEFORMAT. If the source delivers German notation ('1.234,56'), normalize via REPLACE first: REPLACE(REPLACE(@p_input, '.', ''), ',', '.') turns '1.234,56' into '1234.56', which can then be converted with TRY_CONVERT(decimal(18,2), …).
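
The REPLACE normalization spelled out as a runnable statement, with a German-notation input assumed:

  1: DECLARE @p_input AS nvarchar(30) = N'1.234,56';
  2: 
  3: SELECT TRY_CONVERT(decimal(18,2),
  4:                    REPLACE(REPLACE(@p_input, '.', ''), ',', '.')
  5:                   ) AS [Output];  -- 1234.56

The inner REPLACE drops the thousands separator ('1234,56'), the outer one turns the decimal comma into a period ('1234.56'), which TRY_CONVERT then accepts.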

int or bigint — which type when?

int covers any value range up to roughly ±2.1 billion — surrogate keys, counters, quantities. bigint comes in once the counter grows past the int limit (global ID generators, Twitter/snowflake IDs) or when storage efficiency does not matter. Rule of thumb: int as the default, bigint only when there is explicit evidence the range will be exceeded. smallint and tinyint only pay off in tables with billions of rows, where byte frugality becomes measurable — otherwise the maintenance cost (overflow risk as the schema grows) is not justified.

How do I force rounding instead of truncation when converting 1234.5?

TRY_CONVERT(int, 1234.5) returns 1234 — truncation. For commercial rounding, slip an explicit ROUND in first: TRY_CONVERT(int, ROUND(1234.5, 0)) returns 1235. Note: decimal(p, 0) is not a strict drop-in for int — SQL Server silently rounds during scale reduction (TRY_CONVERT(decimal(10, 0), 1234.5) returns 1235, not NULL, and raises no error). There is no out-of-the-box mode that signals the loss of decimal places as NULL; only SET NUMERIC_ROUNDABORT ON turns the silent rounding step into an error (default: OFF).

What to do if a CSV column mixes commas and periods as decimal separators?

Normalize via REPLACE before the TRY_CONVERT. If you know that comma = decimal separator and period = thousands separator (German notation): REPLACE(REPLACE(@p_input, '.', ''), ',', '.'). If the column mixes both notations, you cannot reliably distinguish them via regex ('1,234' is ambiguous) — at that point the input data source is the problem, not the converter. When in doubt: document per record which locale the source uses.

Postgres counterpart to TRY_CONVERT(int, …)?

There is no direct counterpart — CAST(s AS int) throws an exception instead of returning NULL, and Postgres 18 (released 2025-09-25) did not add a built-in try_cast or the SQL/JSON CAST ... ON ERROR NULL syntax either. Solution (as of 18.0): a PL/pgSQL wrapper try_cast_int(p_input text) RETURNS int with EXCEPTION WHEN invalid_text_representation OR numeric_value_out_of_range THEN RETURN NULL (see Postgres bridge). For formatted numbers with thousands or decimal separators, to_number(s, format) is the Postgres path, but locale-dependent via lc_numeric — unlike SQL Server’s locale independence.
