A TokenStream is a list of tokens, gathered during the parse of some entity (say a method). Entities populate these streams by being registered with the lexer. Any class can collect tokens by including TokenStream. From the outside, you use such an object by calling collect_tokens to start collection, followed by calls to add_tokens and pop_token.
Adds tokens to the collected tokens

# File lib/rdoc/tokenstream.rb, line 15
def add_tokens(*tokens)
  tokens.flatten.each { |token| @token_stream << token }
end
Starts collecting tokens

# File lib/rdoc/tokenstream.rb, line 24
def collect_tokens
  @token_stream = []
end
Removes the last token from the collected tokens

# File lib/rdoc/tokenstream.rb, line 33
def pop_token
  @token_stream.pop
end
Generated with the Darkfish Rdoc Generator 1.1.6.