Tokens#

Qualified name: makros.macro_creation.Tokens

class makros.macro_creation.Tokens(tokens: List[TokenInfo], filename: str)#

Bases: object

A helper class that wraps a list of tokens, providing common methods needed when writing a recursive descent parser
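
As an illustration, a minimal sketch of building a Tokens buffer from Python's standard tokenize module, using the constructor signature shown above; the source string and the "<example>" filename are placeholders:

```python
import io
import tokenize

from makros.macro_creation import Tokens

source = "x = 1\n"
token_list = list(tokenize.generate_tokens(io.StringIO(source).readline))

# Wrap the raw TokenInfo list; the filename is presumably used when
# reporting errors (see error() below).
tokens = Tokens(token_list, "<example>")
```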

Methods

advance

Goes forward one token

check

Checks the provided token case against the next token in the buffer

consume

Will consume the next token if it matches the checker, otherwise it will raise an error

error

Will print an error message at the specific token, including context, to help the programmer figure out what is going wrong

is_at_end

Returns true if the next token is an end marker

match

Matches any of the provided cases against the next token.

peek

Returns the next token without advancing the current position in the buffer

previous

Returns the token before the current one

advance() → TokenInfo#

Goes forward one token

Returns:

tokenize.TokenInfo: The token that was just passed over
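
Continuing with the tokens buffer from the construction sketch above, advance() pairs naturally with is_at_end() to walk the whole buffer:

```python
# Walk every token, printing its type and text.
while not tokens.is_at_end():
    tok = tokens.advance()
    print(tok.type, tok.string)
```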

check(checker: TokenCase) → bool#

Checks the provided token case against the next token in the buffer

Args:

checker (TokenCase): The token case to check the next token against

Returns:

bool: True if the next token matches the case
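
A sketch of typical use, assuming name_case is a TokenCase constructed elsewhere (its constructor is not documented on this page). Note that check() only tests the next token; it does not advance:

```python
# name_case is assumed to be a TokenCase built elsewhere.
if tokens.check(name_case):
    name_token = tokens.advance()  # check() does not advance, so step manually
```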

consume(checker: TokenCase, failure_message: str) → TokenInfo#

Will consume the next token if it matches the checker, otherwise it will raise an error

Args:

checker (TokenCase): The case that will be checked against

failure_message (str): The error message that you want to provide to the user

Returns:

tokenize.TokenInfo: The consumed token
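
consume() combines the test and the advance, raising with the supplied message when the next token does not match. A sketch, again assuming a pre-built TokenCase and an illustrative error message:

```python
# open_paren_case is an assumed TokenCase matching an opening parenthesis.
open_paren = tokens.consume(open_paren_case, "Expected '(' after the macro name")
```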

error(token_something_else: TokenInfo, message: str)#

Will print an error message at the specific token, including context, to help the programmer figure out what is going wrong

Args:

token_something_else (tokenize.TokenInfo): The token that the error occurred at

message (str): Your human-readable error message

Raises:

Exception: The error raised so that the failure produces a stack trace
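
A sketch of reporting an error at the most recently consumed token; the condition and message are illustrative, and the raised Exception propagates to the caller:

```python
tok = tokens.advance()
if tok.string != "(":
    tokens.error(tok, "Expected '(' but found %r" % tok.string)
```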

is_at_end() → bool#

Returns true if the next token is an end marker
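
Besides acting as the loop guard shown earlier, is_at_end() is the usual check before looking ahead, for example:

```python
# Stop before peeking past the end marker.
if not tokens.is_at_end():
    upcoming = tokens.peek()
```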

match(*types: TokenCase) → bool#

Matches any of the provided cases against the next token, advancing past it if a match is found

Returns:

bool: True if any of the provided cases matched the next token
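
match() is the convenience used at branch points in a recursive descent parser: it tries several cases and advances on the first hit. A sketch with hypothetical TokenCase instances:

```python
# comma_case and close_paren_case are assumed TokenCase instances.
if tokens.match(comma_case, close_paren_case):
    delimiter = tokens.previous()  # match() advanced, so the hit is now behind us
```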

peek() → TokenInfo#

Returns the next token without advancing the current position in the buffer
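
peek() supports lookahead without consuming anything, for example when deciding between branches before committing; the "(" comparison below is illustrative:

```python
upcoming = tokens.peek()
if upcoming.string == "(":
    ...  # branch into parsing a parenthesised form; nothing has been consumed yet
```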

previous() → TokenInfo#

Returns the token before the current one

Returns:

tokenize.TokenInfo: The previous token
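
previous() is most useful right after advance(), consume(), or match(), when the interesting token has already been stepped over. A sketch, again with a hypothetical TokenCase:

```python
# name_case is an assumed TokenCase matching a NAME token.
if tokens.match(name_case):
    identifier = tokens.previous().string  # the token match() just stepped past
```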