There are so many flaws in the reasoning that I'm almost embarrassed to point them out.
Will the Basilisk punish people who were told about it 10 seconds before their demise?
OK, if not 10 seconds, then 100? 1,000?
What defines "working tirelessly"?
Without sleep?
Isn't that inefficient?
Does this apply to everyone?
Should farmers cease work, causing billions to starve?
Does the Basilisk have the capacity to *correctly* simulate all possible outcomes in which individuals (a number that is technically unbounded) have not done the right thing?
What if people (a famously unreliable group) disagree on the course of action?
Is it OK to cause harm, even though the end goal is to reduce total harm?
How much harm is acceptable to cause?
Only the Basilisk knows; its human precursors can only guess, and hence may make a mistake in good faith.
The list goes on, precisely because the premise pits a hypothetical position of perfect knowledge against the understanding of wetware brains equipped with only 21st-century technology.