Lua integers ending in ".0" are counted as incorrect


#1

Sometimes an integer-valued number in Lua is stored as a float. Converting it to a string appends “.0” to the end of it, and the answer is then always evaluated as wrong. In a competition I had to write print(sum/N%1 == 0 and sum/N|0 or sum/N) instead of print(sum/N) to account for the special case where sum/N is an integer, because of this broken behavior.
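A minimal repro (assuming Lua 5.3 semantics, where / always performs float division even when the result is mathematically an integer):

```lua
local sum, N = 10, 2

-- '/' always yields a float in Lua 5.3, so the result prints with ".0"
print(sum / N)             -- prints "5.0", which the validator rejects
print(math.type(sum / N))  -- prints "float"

-- The workaround from the post: '|0' forces a float with an
-- integral value back into an integer before printing.
print(sum / N % 1 == 0 and sum / N | 0 or sum / N)  -- prints "5"
```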


#2

And? That’s a “feature” of Lua.
https://www.lua.org/manual/5.3/manual.html#3.4.1

print(sum//N)
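For reference, // in Lua 5.3 is floor division, and it only returns an integer when both operands are integers (which is exactly why it doesn’t fit the case below):

```lua
print(10 // 2)    -- prints "5"   (integer // integer -> integer)
print(10 // 2.0)  -- prints "5.0" (any float operand -> float result)
print(7 // 2)     -- prints "3"   (floor division discards the remainder)
```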


#3

// is for integer division. The problem allowed sum/N to be a non-integer; I only got it wrong when it happened to be an integer, because the output ended in .0. That shouldn’t be counted as wrong. The broken behavior is the website counting the answer as wrong, not the language.


#4

Ah OK, I misunderstood. Unfortunately, yes: solutions must match exactly at the character level.
An extra space, or 1 instead of 1.0, fails the validator.
It happens in other languages too, with different quirks. /shrugs


#5

It is annoying and feels broken. I frequently have to spam ~~ or |0 or math.floor() on values that are already integers just to strip the “.0” when they print, and in special cases where a decimal is allowed, I have to check whether the decimal part is 0 so it doesn’t break again. It adds characters to my golfed programs and takes longer to write in competitions.

example: one Clash of Code problem was “print the sum of the first N odd integers.” I write print(io.read()^2) and it’s wrong. I have to write print(io.read()^2|0) to make it a “real” integer first.
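The culprit here is that ^ (like /) always produces a float in Lua 5.3, even with integer operands, so the |0 conversion is needed before printing:

```lua
print(3 ^ 2)             -- prints "9.0", not "9": '^' always yields a float
print(math.type(3 ^ 2))  -- prints "float"
print(3 ^ 2 | 0)         -- prints "9": '|0' converts the integral float back
```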

better example: I’m supposed to print the average of x1 and x2 and then the average of y1 and y2 separated by a space.

local a, b = (x1+x2)/2, (y1+y2)/2
if a%1 == 0 then a=a|0 end
if b%1 == 0 then b=b|0 end
print(a.." "..b)

I really would have liked to simply write print(((x1+x2)/2).." "..((y1+y2)/2)), but I knew even that wasn’t going to work because of this bug.
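A shorter alternative (not from the original posts; it relies on string.format’s C-style %g, which drops a trailing “.0” but keeps genuine decimals) would be something like:

```lua
local x1, x2, y1, y2 = 1, 3, 2, 5  -- hypothetical inputs
-- '%g' formats 2.0 as "2" and 3.5 as "3.5", so no manual '|0' dance is
-- needed (beware: '%g' rounds to 6 significant digits by default)
print(("%g %g"):format((x1 + x2) / 2, (y1 + y2) / 2))  -- prints "2 3.5"
```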


#6

This is not broken or a bug, but the new default behaviour of Lua 5.3, which CG uses (I guess you still use Lua 5.2 locally).


#7

The new behavior in 5.3 adds the .0, and answers are not accepted because of it; I’m not using 5.2 (which didn’t have distinct floats and ints and didn’t add .0). The broken behavior is the fact that answers are not accepted, not that the language does this.


#8

Well, CG puzzles use strictly textual input/output. CG “games” (or anything with a visualization) can (but do not necessarily) call some custom code to parse/validate the output.
In the first case, the output format should ideally be neatly specified and uniform (e.g. a floating-point number with exactly two decimal places after the dot), so that what you point out here should never be a problem.
In the second case, the output could be less strictly specified, but the validation code should be more lenient (and certainly not stricter), so that again what you point out here should never be a problem.
So I guess you’re right, but it’s not a general broken mechanic of the platform; it’s more of a design problem in one or more specific puzzles/games.