BASIC files are tokenised when stored as file type BASIC (&FFB). Usually the tokenisation is performed by the BASIC module. However, this isn't always possible - for example, if the process is being run on a non-RISC OS system. The *BASICTokenise tool can be used to convert untokenised BASIC Text (filetype &FD1) to tokenised BASIC.
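To give a feel for what the tokenisation step involves, here is a minimal Python sketch that turns a couple of trivial BASIC statements into tokenised bytes. It is purely illustrative and is not how *BASICTokenise or the BASIC module are implemented: the keyword table is a tiny subset, string literals and variable names are not handled, and the line layout (header bytes and end marker) is assumed from the commonly documented Acorn format rather than taken from this tool's source.

```python
# Illustrative sketch only - NOT the BASIC module's algorithm, nor the
# implementation of *BASICTokenise.
# Assumed line layout (common Acorn tokenised BASIC format):
#   <&0D> <line number hi> <line number lo> <line length> <tokenised text>
# with the program terminated by <&0D> <&FF>.

TOKENS = {            # tiny subset of the BBC BASIC keyword tokens
    b"PRINT": 0xF1,
    b"FOR":   0xE3,
    b"TO":    0xB8,
    b"NEXT":  0xED,
    b"END":   0xE0,
}

def tokenise_line(number, text):
    """Tokenise a single, very simple BASIC statement."""
    body = text.encode("latin-1")
    for keyword, token in TOKENS.items():
        # Naive substitution: a real tokeniser must not touch keywords
        # inside string literals, comments or variable names.
        body = body.replace(keyword, bytes([token]))
    # The length byte is assumed to cover the whole line, including the
    # four header bytes.
    return bytes([0x0D, (number >> 8) & 0xFF, number & 0xFF, len(body) + 4]) + body

def tokenise_program(lines):
    out = bytearray()
    for index, text in enumerate(lines):
        out += tokenise_line(10 + 10 * index, text)
    out += bytes([0x0D, 0xFF])        # end-of-program marker
    return bytes(out)

if __name__ == "__main__":
    program = tokenise_program(['PRINT "Hello"', 'END'])
    print(program.hex(" "))
```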
*BASICTokenise <input> <output>
| Parameter | Description |
|---|---|
| `<input>` | filename of the text file to read |
| `<output>` | filename of the tokenised BASIC file to write to |
This command is used to tokenise a textual BASIC program. *BASICTokenise will process the text file to produce a tokenised BASIC file.
The tool is only capable of converting simple BASIC code which does not reference line numbers; for example, programs containing statements such as `GOTO 100` or `GOSUB 200` may not be converted correctly.
*BASICTokenise Source !RunImage
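This example reads the BASIC text in the file `Source` and writes the tokenised program to `!RunImage`. As noted above, BASIC programs are held as filetype BASIC (&FFB), so when working on a non-RISC OS system you may need to ensure the output carries that filetype in whatever way your environment expects.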
| Maintainer(s): | Charles Ferguson <gerph@gerph.org> |
|---|---|
| History: | |
| Disclaimer: | © gerph, 2020. |