Unary - puzzle discussion

I’m having a problem with the % test. I get…

Fail
Found: "00 0 0 0 00 00 0 0 00 0"
Expected: "00 0 0 0 00 00 0 0 00 0 "

The “expected” result has an extra space at the end, but the sample result out.txt file doesn’t. Is this a problem with the auto-grader, or am I doing something silly here?


Hi,

 '00 0 0 0 00 00 0 0 00 0'      <- FOUND
 '00 0 0 0 00 00 0 0 00 0 '     <- EXPECTED
 '00 0 0 0 00 00 0 0 00 0 0 0'  <- OUT.TXT

Yes, the expected answer stops at the first wrong char.


Doh!
Thanks for your replies.

I have a problem. I convert the letter "h", which is 104 in decimal, so 1101000 in binary, and I did that with the "bitset" function. From my string "1101000", I try to take every character, convert it to an integer, and build an array. The problem is that the row in the array always ends up as 1101001, so it changes my last bit from 0 to 1. If I try to change that bit back to 0 manually, it changes the first bit of the next letter, "u". What should I do?

Obviously, your function is faulty.
You can manually calculate the binary representation of 104 to check it.
You can also store the values in a string.

This is what I am doing:

    x = 104;
    for (j = 0; j < 7; j++) {
        a[i][j] = x % 2;
        x = x / 2;
    }

It sets the row of array a to 1101000, so a[i][6] = 0. But after I fill array a with all the rows, a[i][6] becomes 1, so the row is 1101001.

Try the bitwise operator '&' (not the logical '&&').

Dunno if it helps, but I used http://www.freepascal.org/docs-html/rtl/system/binstr.html :

function binstr(val : longint; cnt : byte) : shortstring;
var
  i : longint;
begin
  binstr[0] := char(cnt);                 { set the shortstring's length byte }
  for i := cnt downto 1 do
   begin
     binstr[i] := char(48 + val and 1);   { 48 = ord('0'); take the lowest bit }
     val := val shr 1;                    { shift to the next bit }
   end;
end;

Will there ever be UTF versions of this puzzle?

Hi guys, I’m doing this puzzle in C# and so far 3 of the 4 tests pass; the only failing one is the last:
Found: "0 0 00 0000 0 0000 00 0 0 0 00 000 0 000 00 0 0 0 00 0 0 000 00 000 0 0000 00 0 0 0 00 0 0 000" Expected: "0 0 00 0000 0 0000 00 0 0 0 00 000 0 000 00 0 0 0 00 0 0 000 00 000 0 0000 00 0 0 0 00 0 0 00 "

Here is the binary value I translate: 100001111010001110101110001111010111000001001110110111111100101110010110100111100111001111000001101011110010111110011100010110111111000011110010110010010000011010001100001111001110000011001010000011010111100101111100111100111110101000001100001000001100001110111011001001000001110111110100011010011110100110010110000011100111110000110000111000111100101101110

Any clue?

Look at your last 0, it should be a space.

I guess it’s because you got ‘100000’ for the ‘k’ symbol instead of ‘0100000’. A character always translates to 7 bits.

Nice! I didn’t think about that, but that might be the error, gonna test this!

This was the actual issue: I added “0” at the beginning of the whole string while its length % 7 != 0. To pass the test I actually had to pad each character separately (which makes more sense), so thank you!


Hey guys, I have a question. Is there a possibility that something is wrong with this C++ compiler?
I solved the puzzle in Visual Studio and got exactly the results needed. But when I use exactly the same code here, there is an error.

 Found:     "0 0 00 0000 0 0000 00 0 0 0 00 0000"
 Expected:  "0 0 00 0000 0 0000 00 0 0 0 00 000 "

I would upload a screenshot of my output from visual studio, but it doesn’t let me.
But, here’s my output:

Chuck Norris’ keyboard has 2 keys: 0 and white space.
0 0 00 0000 0 0000 00 0 0 0 00 000 0 000 00 0 0 0 00 0 0 000 00 000 0 0000 00 0 0 0 00 0 0 00 00 0 0 0 00 00000 0 0 00 00 0 000 00 0 0 00 00 0 0 0000000 00 00 0 0 00 0 0 000 00 00 0 0 00 0 0 00 00 0 0 0 00 00 0 0000 00 00 0 00 00 0 0 0 00 00 0 000 00 0 0 0 00 00000 0 00 00 0 0 0 00 0 0 0000 00 00 0 0 00 0 0 00000 00 00 0 000 00 000 0 0 00 0 0 00 00 0 0 000000 00 0000 0 0000 00 00 0 0 00 0 0 00 00 00 0 0 00 000 0 0 00 00000 0 00 00 0 0 0 00 000 0 00 00 0000 0 0000 00 00 0 00 00 0 0 0 00 000000 0 00 00 00 0 0 00 00 0 0 00 00000 0 00 00 0 0 0 00 0 0 0000 00 00 0 0 00 0 0 00000 00 00 0 0000 00 00 0 00 00 0 0 000 00 0 0 0 00 00 0 0 00 000000 0 00 00 00000 0 0 00 00000 0 00 00 0000 0 000 00 0 0 000 00 0 0 00 00 00 0 0 00 000 0 0 00 00000 0 000 00 0 0 00000 00 0 0 0 00 000 0 00 00 0 0 0 00 00 0 0000 00 0 0 0 00 00 0 00 00 00 0 0 00 0 0 0 00 0 0 0 00 00000 0 000 00 00 0 00000 00 0000 0 00 00 0000 0 000 00 000 0 0000 00 00 0 0 00 0 0 0 00 0 0 0 00 0 0 000 00 0

Don’t you see your extra 0?

Yes, I see it, but I only get it in CodinGame’s C++. That’s my Visual Studio output, and there is no extra 0.

Make sure you are adding leading zeros to ALL bytes that are less than 7 bits. In Java, I made strings out of each byte and then added leading zeros to the string while length % 7 != 0.

Second test passed but the first test failed!
Found: “0 0 00 0000 0 00.”
Expected: “0 0 00 0000 0 00”

But when I run it locally, no “.” is found in the output.
How come there is a dot in my output?