Gyoji Compiler
Gyoji::context::TokenStream Class Reference

Stream of tokens read by the parser to provide context for errors. More...

#include <token-stream.hpp>

Public Member Functions

 TokenStream ()
 
 ~TokenStream ()
 
const std::vector< Gyoji::owned< Token > > & get_tokens () const
 
const SourceReference & get_current_source_ref () const
 
std::string get_line (size_t _line) const
 
const Token & add_token (TokenID _typestr, std::string _value, const std::string &_filename, size_t _line, size_t _column)
 
std::vector< std::pair< size_t, std::string > > context (size_t line_start, size_t line_end) const
 
void append_token (std::string _value)
 

Detailed Description

Stream of tokens read by the parser to provide context for errors.

The token stream represents the list of all tokens encountered while parsing the input file. Each token corresponds to a match rule in the lexical analysis stage (gyoji.l) and carries metadata recording where the token was found and which rule matched it. The token stream can be used to reproduce the input exactly, which makes it useful for constructing structured error messages that need context from the original source file.
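
As an illustration, here is a minimal sketch of how a lexer and an error reporter might interact with the stream. The TokenID value TOKEN_IDENTIFIER, the file name, and the output formatting are assumptions for the example, not part of the documented interface.

    #include <iostream>
    #include "token-stream.hpp"

    using Gyoji::context::TokenStream;

    void demo()
    {
        TokenStream stream;
        // Hypothetical token recording, as the lexer would do on a match.
        stream.add_token(TOKEN_IDENTIFIER, "main", "example.gj", 1, 1);
        // Later, an error reporter can pull back the surrounding source lines.
        for (const auto &entry : stream.context(1, 1))
            std::cout << entry.first << " | " << entry.second << "\n";
    }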

Constructor & Destructor Documentation

◆ TokenStream()

TokenStream::TokenStream ( )

Creates an empty token stream to hold token data.

◆ ~TokenStream()

TokenStream::~TokenStream ( )

Destructor, nothing fancy.

Member Function Documentation

◆ add_token()

const Token & TokenStream::add_token (TokenID _typestr, std::string _value, const std::string &_filename, size_t _line, size_t _column)

This is used by the lexical analysis stage (gyoji.l) when it matches a rule for a token. It records the token type (the match rule), the text that matched, and the source line number and column where the token was found.
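
A rule in gyoji.l might record its match roughly as follows. The enumerator TOKEN_IDENTIFIER and the position-tracking variables are illustrative names, not the compiler's actual ones:

    // Hypothetical flex action body; yytext holds the matched text.
    stream.add_token(TOKEN_IDENTIFIER,
                     std::string(yytext),
                     current_filename,
                     current_line,
                     current_column);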

◆ append_token()

void TokenStream::append_token ( std::string  _value)

This method appends a value to the most recently added token; it has no effect if no tokens have been added yet. It is used internally when parsing constructs such as C-style multi-line comments, where a single rule may not match the entire token and the match is instead broken up across several rules.
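
For example, a block comment spanning three matches might be accumulated like this (the rule breakdown and TOKEN_COMMENT are illustrative):

    // First rule matches the opening of the comment...
    stream.add_token(TOKEN_COMMENT, "/* part one", "example.gj", 3, 1);
    // ...and continuation rules extend the same token.
    stream.append_token("\n   part two");
    stream.append_token(" */");
    // The stored token now holds the complete comment text.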

◆ context()

std::vector< std::pair< size_t, std::string > > TokenStream::context (size_t line_start, size_t line_end) const

This returns a list of lines from the source file starting with line_start and ending with line_end (inclusive). This is useful in providing context to structured errors.

Parameters
    line_start  Start line number to retrieve.
    line_end    Last line number to retrieve.
Returns
A list of (line number, line text) pairs for the requested lines from the source file.
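
A sketch of printing an error with one line of context on either side; err_line and the gutter formatting are assumptions for the example:

    #include <iomanip>
    #include <iostream>
    #include "token-stream.hpp"

    // Print err_line plus one neighbor on each side, with a line-number
    // gutter. Assumes 1-based line numbers and err_line > 1.
    void print_context(const Gyoji::context::TokenStream &stream, size_t err_line)
    {
        for (const auto &entry : stream.context(err_line - 1, err_line + 1))
            std::cout << std::setw(4) << entry.first << " | "
                      << entry.second << "\n";
    }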

◆ get_current_source_ref()

const SourceReference & TokenStream::get_current_source_ref ( ) const

Returns the most recent source reference found. If no source reference has been recorded yet, parsing is still at the first token of the file, and the initial reference is returned.
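
A parser error handler might use the reference to point at the current position. The SourceReference accessors shown here are assumptions; the real interface may differ:

    // Hypothetical accessors on SourceReference.
    const SourceReference &ref = stream.get_current_source_ref();
    std::cerr << ref.get_filename() << ":" << ref.get_line()
              << ":" << ref.get_column() << ": unexpected token\n";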

◆ get_line()

std::string TokenStream::get_line ( size_t  _line) const

This returns the exact text of a single line of source data. This is useful in constructing the context for structured error messages.
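
For example, an error printer might show the offending line with a caret under the column; err_line, err_col, and 1-based columns are assumptions here:

    // Print the line, then a caret under column err_col.
    std::cerr << stream.get_line(err_line) << "\n"
              << std::string(err_col - 1, ' ') << "^\n";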

◆ get_tokens()

const std::vector< Gyoji::owned< Token > > & TokenStream::get_tokens ( ) const

This returns a list of all of the tokens found during the parse. This is returned as an immutable list of pointers owned by the token stream.
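
Since the stream preserves every token in order, the original input can be reconstructed by walking this list. The get_value() accessor on Token is an assumption for the sketch:

    // Concatenating every token's text reproduces the input exactly.
    std::string reconstructed;
    for (const auto &tok : stream.get_tokens())
        reconstructed += tok->get_value();  // assumed accessor on Token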

