ttm wrote:
I'm guessing the reason for this is that some_string(n) doesn't return the nth character of some_string as a char, it returns it as some pseudo-char type
I would tend to think it returns a string of some length (not necessarily 1). Consider the following code:
Turing: |
var myString : string(1) := "" % we assign an empty string to a string(1) (this is legal)
var myString2 : string := myString % we assign a string(1) to a string (legal)
var myChar : char := myString % we assign an empty string(1) to a char (not legal)
|
Notice that you can't always assign a string of length 1 to a character. In fact, there are also certain char values that cannot be converted to strings (more on this later).
Also, string(n) and myString(n) have different meanings: string(n) is a string type of maximum length n, while myString(n) is the character(s) at index n. For example:
Turing: |
var myString : string(1) := "H" % a string of length 1
var myString2 : string(11) := "Hello World" % a string of length 11 (note that string(10) would be too short)
put myString2(1) % will produce H (in principle this could be a char or a string)
put myString2(1 .. 5) % will produce Hello (this is obviously a string)
|
We can conclude from this that there is no reason to believe myString(n) should produce a char rather than a string of length 1. You might be wondering, "What's the difference?" Well, let's see what the Turing documentation has to say.
http://compsci.ca/holtsoft/doc/char.html wrote:
The char type differs from the string(1) type in the following way: char always represents exactly one character, while string(1) can represent either the null string or a string containing one character.
[...]
There are 256 char values, corresponding to the distinct patterns in an 8-bit byte. This allows the patterns eos (internal value 0) and uninitchar (internal value 128) to be char values (these patterns are not allowed in the string type; see the string type).
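To see this concretely, here is a small sketch using Turing's standard chr and ord conversions: chr(0) is a perfectly valid char value, but it can never be stored in a string.
Turing: |
var c : char := chr(0) % legal: eos (internal value 0) is a valid char value
put ord(c) % prints 0
var s : string
% s := c % not legal: strings may not contain chr(0)
|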
More reading yields:
http://compsci.ca/holtsoft/doc/char_n.html wrote:
A char value can be assigned (or passed to an non var parameter) with automatic conversion to a char(1) variable and vice versa. String values of length 1 can be assigned to char variables. Character (char) values can be assigned to string variables, yielding a string of length 1.
Notice that it is explicitly stated that a char can be passed as a char(1) parameter, but nothing of the sort is stated for char and string(1) (it only says we can assign the values to variables).
So now it should be obvious that the problem is not a bug, but rather a consequence of mixing variable types: a string(1) is not a char, so it cannot be used directly where a char index is expected without forcing a conversion, for example through a function like yours or by assigning it to a char variable first.
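As a sketch of that last point (the table and names here are just for illustration), the same index works once it has been squeezed through a char variable, since the documentation above says string values of length 1 can be assigned to char variables:
Turing: |
var table : array char of int
for i : char
    table(i) := 0
end for
var s : string := "Hello"
% put table(s(1)) % not legal: s(1) is a string, not a char
var c : char := s(1) % assignment forces the string(1)-to-char conversion
put table(c) % legal: c is a char
|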
A solution you may wish to implement is to use the char(n) type.
Turing: |
var table : array char of int
for i : char
    table(i) := Rand.Int(1, 100)
end for
var line : char(13) := "Hello, World!"
put table(line(1))
|
I admit this may not be ideal, as we lose much of our string functionality. Since the length n must be known at compile time, you could declare a variable of type char(255). However, the char(n) type has no uninitialized value and no length value, which may annoy you. You could do what strings do and adopt the convention that character values 0 and 128 mean eos (end of string) and uninitchar (uninitialized), but at that point you may be doing more work than you would have had you just stuck to strings.
Hope this helps clear up why this doesn't work the way you would like.