Tokenizer

An executable tokenizer of Yul.

This is a simple tokenizer for Yul code. It lexes the input and then discards the comment and whitespace lexemes, leaving only the tokens.

The primary API functions for tokenizing are tokenize-yul and tokenize-yul-bytes.
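
For a rough sense of usage, here is a minimal sketch of a call in an ACL2 session. It assumes, based on the descriptions below, that tokenize-yul takes a Yul source string and returns either a list of token trees or a reserrp error value; the include-book path is likewise an assumption, not confirmed by this page.

    ;; Minimal usage sketch (book path and return behavior are assumptions):
    (include-book "kestrel/yul/language/tokenizer" :dir :system)

    ;; Tokenize a small Yul snippet; on success this is assumed to yield a
    ;; list of ABNF token trees, and on failure a reserrp error value.
    (tokenize-yul "{ let x := add(1, 2) }")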

Subtopics

Tokenize-yul
Lexes the bytes of yul-string into a list of tokens.
Filter-and-reduce-lexeme-tree-to-subtoken-trees
Sees through lexeme and token rules to return a list of keyword, literal, identifier, and symbol trees.
Check-and-deref-tree-token?
Checks whether the ABNF tree is a nonleaf for rule "token", extracting its component tree (keyword, literal, identifier, or symbol) if so; otherwise returns a reserrp.
Check-and-deref-tree-lexeme?
Checks whether the ABNF tree is a nonleaf for rule "lexeme", extracting its component tree (token, comment, or whitespace) if so; otherwise returns a reserrp.
Tokenize-yul-bytes
Lexes the bytes of a Yul source program into a list of tokens.
Is-tree-rulename?
True if tree is a nonleaf for the rule named by rulename-string.
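
For the byte-level entry point, a similar sketch may help. It assumes tokenize-yul-bytes accepts a list of character codes (unsigned 8-bit bytes); the chars-to-bytes helper is hypothetical, defined here only for the example.

    ;; Hypothetical helper: map char-code over a character list.
    (defun chars-to-bytes (chars)
      (declare (xargs :guard (character-listp chars)))
      (if (endp chars)
          nil
        (cons (char-code (car chars))
              (chars-to-bytes (cdr chars)))))

    ;; Assumed to behave like tokenize-yul, but on a list of bytes
    ;; rather than on a string.
    (tokenize-yul-bytes
     (chars-to-bytes (coerce "{ let x := add(1, 2) }" 'list)))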