Module llex

Lua 5.1+ lexical analyzer written in Lua.

This file is part of LuaSrcDiet, based on Yueliang material.

Notes:

  • This is a version of the native 5.1.x lexer from Yueliang 0.4.0, with significant modifications to handle LuaSrcDiet's needs: (1) llex.error is an optional error handler function, and (2) seminfo for strings includes their delimiters and no translation operations are performed on them (see the sketch after this list).
  • ADDED shbang handling to support executable scripts.
  • NO localized decimal point replacement magic.
  • NO limit to number of lines.
  • NO support for compatible long strings (LUA_COMPAT_LSTR).
  • Added goto keyword and double-colon operator (Lua 5.2+).
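
For example (a sketch assuming the lex function documented below; the require path is an assumption), the seminfo entries for string tokens keep the quotes or long brackets exactly as written in the source:

  local llex = require("llex")  -- require path is an assumption

  local tokens, seminfos = llex.lex([[local s = 'hi' .. "there"]], "@demo")
  for i = 1, #tokens do
    -- the two string tokens appear in seminfos as 'hi' and "there",
    -- with their delimiters intact
    print(tokens[i], seminfos[i])
  end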

Functions

lex (source, source_name) Runs the lexer on the given source code.

Local Functions

addtoken (token, info) Adds information to the token listing.
inclinenumber (i, is_tok) Handles line number incrementation and end-of-line characters.
chunkid () Returns a chunk name or id, with no truncation for long names.
errorline (s, line) Formats error message and throws error.
skip_sep (i) Counts separators ('=') in a long string delimiter.
read_long_string (is_str, sep) Reads a long string or long comment.
read_string (del) Reads a string.
init (_z, _sourceid) Initializes the lexer for the given source _z and source name _sourceid.


Functions

lex (source, source_name)
Runs the lexer on the given source code.

Parameters:

  • source string The Lua source to scan.
  • source_name string Name of the source (optional).

Returns:

  1. {string,...} A list of lexed tokens.
  2. {string,...} A list of semantic information (lexed strings).
  3. {int,...} A list of line numbers.
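
A minimal usage sketch, assuming the module is loadable as llex (the require path is an assumption); the lex signature and the three return lists, treated here as parallel lists with one entry per token, are as documented above.

  local llex = require("llex")  -- require path is an assumption

  local source = [[
  local answer = 42  -- a comment
  print("answer:", answer)
  ]]

  local tokens, seminfos, linenumbers = llex.lex(source, "@example.lua")

  -- read the three lists as parallel: entry i describes the i-th token
  for i = 1, #tokens do
    print(linenumbers[i], tokens[i], seminfos[i])
  end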

Local Functions

addtoken (token, info)
Adds information to the token listing.

Parameters:

  • token string The token to add.
  • info string The semantic information (lexed string) for the token.

inclinenumber (i, is_tok)
Handles line number incrementation and end-of-line characters.

Parameters:

  • i int Position of lexer in the source stream.
  • is_tok bool

Returns:

    int

chunkid ()
Returns a chunk name or id, with no truncation for long names.

Returns:

    string

errorline (s, line)
Formats error message and throws error.

A simplified version; it does not report which token was responsible.

Parameters:

  • s string
  • line int The line number.

Raises:

An error with the formatted message.

skip_sep (i)
Counts separators ('=') in a long string delimiter.

Parameters:

  • i int Position of lexer in the source stream.

Returns:

    int
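
A rough illustration of the separators being counted: the '=' characters of Lua's long-bracket syntax (an illustrative sketch, not a call to this local function).

  -- Lua long brackets use '=' characters as separators between the
  -- square brackets; the number of '=' signs is the bracket's level.
  local a = [[level 0, no separators]]
  local b = [==[level 2; may safely contain ]] inside]==]
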
read_long_string (is_str, sep)
Reads a long string or long comment.

Parameters:

  • is_str bool True when reading a long string, false when reading a long comment.
  • sep int The separator count from skip_sep.

Returns:

    string

Raises:

if the long string or comment is unfinished.

read_string (del)
Reads a string.

Parameters:

  • del string The string delimiter (a single or double quote).

Returns:

    string

Raises:

if the string is unfinished or an escape sequence is too large.

init (_z, _sourceid)
Initializes the lexer for the given source _z and source name _sourceid.

Parameters:

  • _z string The source code.
  • _sourceid string Name of the source.

generated by LDoc 1.4.6