In MariaDB 10.2 (and MySQL 5.7, MySQL 8.0), executing SELECT 1; returns a text result set where the column definition packet specifies a type of MYSQL_TYPE_LONGLONG (0x08), i.e., a 64-bit integer.
In MariaDB 10.3, the same query returns a text result set with the type MYSQL_TYPE_LONG (0x03), i.e., a 32-bit integer.
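For illustration, here is a minimal sketch of how the difference surfaces in a .NET client (assuming the MySqlConnector ADO.NET driver and a placeholder connection string); GetFieldType(0) reports the CLR type that the driver maps the wire type to:

    // Minimal sketch, assuming the MySqlConnector ADO.NET driver.
    // The connection string is a placeholder for a 10.2 or 10.3 server.
    using System;
    using MySqlConnector;

    class TypeCheck
    {
        static void Main()
        {
            using var connection = new MySqlConnection("Server=localhost;User ID=root;Password=...");
            connection.Open();

            using var command = new MySqlCommand("SELECT 1;", connection);
            using var reader = command.ExecuteReader();
            reader.Read();

            // Based on the captures above: System.Int64 (MYSQL_TYPE_LONGLONG) against
            // MariaDB 10.2 / MySQL 5.7 / 8.0, but System.Int32 (MYSQL_TYPE_LONG)
            // against MariaDB 10.3.
            Console.WriteLine(reader.GetFieldType(0));
        }
    }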
I've attached two packet captures that show the difference: dump-10_2.pcap and dump-10_3.pcap.
In .NET there are workarounds, such as Convert.ToInt32 (see the sketch below), and this kind of statement may be rare in practice. However, while the protocol documentation doesn't specify (AFAIK) how numeric literals should be represented on the wire, I also couldn't find this documented as a deliberate change in behaviour.
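A sketch of the Convert.ToInt32 workaround (again assuming MySqlConnector; the helper name is just illustrative):

    using System;
    using MySqlConnector;

    static class Workaround
    {
        // ExecuteScalar boxes a long against servers that report MYSQL_TYPE_LONGLONG
        // (MariaDB 10.2, MySQL 5.7/8.0) and an int against MariaDB 10.3.
        internal static int SelectOne(MySqlConnection connection)
        {
            using var command = new MySqlCommand("SELECT 1;", connection);
            object value = command.ExecuteScalar();

            // Convert.ToInt32 accepts either boxed type; a direct unboxing cast such as
            // (int)value would throw InvalidCastException when the box holds a long.
            return Convert.ToInt32(value);
        }
    }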
Note that SELECT 100000000000; (or any other literal outside the range of a 32-bit signed integer) is automatically promoted to MYSQL_TYPE_LONGLONG in the result set.
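Continuing from the connection in the first sketch, this is what that promotion looks like from the client side:

    // The literal exceeds the 32-bit signed range, so both server versions report
    // MYSQL_TYPE_LONGLONG and the driver boxes an Int64. Convert.ToInt32 would throw
    // OverflowException here, so that workaround only helps for values that actually
    // fit in 32 bits.
    object big = new MySqlCommand("SELECT 100000000000;", connection).ExecuteScalar();
    long bigValue = Convert.ToInt64(big);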