I have an array like this:
$array = ['1', '2', '100'];
and my query is:
$query = Customer::whereIn('id', $array)->get();
When $array is too large (1500 items or more), the query does not work and returns a null value.
I also increased max_allowed_packet in the MySQL config, but the problem was not solved.
PHP v7.33, Laravel v7.19, MySQL v5.7
Ok, this took me 6 straight hours, but I managed to find a deep explanation of the problem!
The problem is in fact not with Eloquent itself, but with the combination of Eloquent and the way the database optimises queries.
Eloquent prepares all queries with PDO::prepare(). In the case of a whereIn() query, the prepared statement looks like
SELECT * FROM model WHERE id IN (?, ?, ?, ?, ?, ?, ?, ?, ?, ...);
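To see exactly what Eloquent hands to PDO, you can dump the generated SQL and the bound values before running the query. A quick sketch, assuming the Customer model and $array from the question:

// Sketch: inspect the SQL and bindings Eloquent builds for whereIn()
$query = Customer::whereIn('id', $array);
dump($query->toSql());              // e.g. select * from `customers` where `id` in (?, ?, ?, ...)
dump(count($query->getBindings())); // one bound value per item in $array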
On its side, the database (MySQL, MariaDB) has a variable named in_predicate_conversion_threshold specifically for optimising the IN() clause. By default its value is 1000, which means that a longer IN() list is converted to an optimised form that uses a subquery.
So, binding more than 999 items in an IN() clause can make the database silently return 0 results, without raising any error.
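Before picking a workaround, you can check whether your server actually exposes this threshold; the variable is documented by MariaDB, so on some servers it may simply not exist. A sketch using the DB facade (use Illuminate\Support\Facades\DB):

// Sketch: read the server-side threshold, if the server has it
$row = DB::selectOne("SHOW VARIABLES LIKE 'in_predicate_conversion_threshold'");
dump($row->Value ?? 'variable not present on this server');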
Workaround 1: Use a raw query on the Laravel side.
Workaround 2: Use unprepared statements.
Workaround 3: Chunk the array for selects or updates (see the sketch after this list).
Workaround 4: Increase or remove the threshold for the IN() optimisation (a value of 0 means no optimisation).
For the last point, just execute SET in_predicate_conversion_threshold = 0; before your query, or set it up globally in the server config file. (Do not run the SET statement in a separate command-line session and expect it to apply to your application: the change is session-scoped.)
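Here is a minimal sketch of workarounds 3 and 4 in Laravel, assuming the Customer model and $array from the question (and use Illuminate\Support\Facades\DB for the facade). The SET statement only helps on servers that actually have the variable; chunking works everywhere:

// Workaround 3: chunk the id list so each query stays under the threshold
$customers = collect($array)->chunk(500)->flatMap(function ($ids) {
    return Customer::whereIn('id', $ids)->get();
});

// Workaround 4: disable the in-to-subquery conversion for the current session
DB::statement('SET in_predicate_conversion_threshold = 0');
$customers = Customer::whereIn('id', $array)->get();

Chunking is the most portable option, since it keeps every statement under the threshold no matter how the server is configured.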
I would like to upvote this issue because we encounter it in our infrastructure as well.
The biggest problem is not that Eloquent returns null or an empty result, but that it produces no error at all, which makes it very hard to debug.
Use whereIntegerInRaw, which makes sure the array contains only integers but skips the prepare step:
Customer::whereIntegerInRaw('id', $array)->get();
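For comparison with the prepared whereIn() version, whereIntegerInRaw casts each value to an integer and inlines it into the SQL, so no placeholders are bound. A quick sketch using toSql():

// Sketch: values are inlined as integers instead of being bound as placeholders
dump(Customer::whereIntegerInRaw('id', $array)->toSql());
// e.g. select * from `customers` where `id` in (1, 2, 100)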