BASICTokenise


Contents

Introduction and Overview
*Commands
    *BASICTokenise
Document information


Introduction and Overview

BASIC files are tokenised when stored as filetype BASIC (&FFB). Usually the tokenisation process is performed by the BASIC module. However, this isn't always possible - for example, when the conversion is being performed on a non-RISC OS system. The *BASICTokenise tool can be used to convert untokenised BASIC Text (filetype &FD1) to tokenised BASIC.
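
As an illustration, a BASIC Text file is simply the plain-text listing of a program; a small example program such as the following would be a suitable input for conversion:

10 REM Example program
20 FOR I% = 1 TO 5
30   PRINT "Hello from BASIC "; I%
40 NEXT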


*Commands


*BASICTokenise

Convert a BASIC Text file to a tokenised BASIC file
Syntax
*BASICTokenise <input> <output>
Parameters
<input> - filename of the text file to read
<output> - filename of the tokenised BASIC file to write to
Use

This command is used to tokenise a textual BASIC program. *BASICTokenise will process the text file to produce a tokenised BASIC file (filetype &FFB).

The tool is only capable of converting simple BASIC code, without line number references.
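
For example, a program which refers to its own line numbers, such as the following illustrative fragment, falls outside what the tool can convert; programs which use structured constructs rather than line number references are suitable:

10 PRINT "Hello"
20 GOTO 10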

Example
*BASICTokenise Source !RunImage
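
As a further illustration, the command might be invoked from an Obey file as part of a build sequence; the paths used here are hypothetical:

| Tokenise the textual source into the application's runnable image
*BASICTokenise <Obey$Dir>.Source <Obey$Dir>.!RunImage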
Related APIs
None

Document information

Maintainer(s): Charles Ferguson <gerph@gerph.org>
History:
Revision  Date  Author  Changes
1               Gerph   Initial version
Disclaimer: © gerph, 2020.