Tokenizes the current Document in the background and caches the tokenized rows for future use. If a row is changed, everything below that row is re-tokenized.

Constructors

• new BackgroundTokenizer(Tokenizer tokenizer, Editor editor)

Creates a new BackgroundTokenizer object.

Arguments

tokenizer (Tokenizer): Required. The tokenizer to use.
editor (Editor): Required. The editor to associate with.
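
For illustration, a minimal sketch of constructing a BackgroundTokenizer. The module paths and the mode-based tokenizer setup reflect typical Ace usage and are assumptions, not part of this reference:

    var BackgroundTokenizer = require("ace/background_tokenizer").BackgroundTokenizer;
    var JavaScriptMode = require("ace/mode/javascript").Mode;

    // A Tokenizer is usually obtained from a language mode rather than built by hand.
    var tokenizer = new JavaScriptMode().getTokenizer();

    // `editor` is assumed to be an existing Editor instance.
    var bgTokenizer = new BackgroundTokenizer(tokenizer, editor);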

Methods

    • BackgroundTokenizer.fireUpdateEvent(Number firstRow, Number lastRow)

    Emits the 'update' event. firstRow and lastRow are used to define the boundaries of the region to be updated.

    Arguments

    firstRow (Number): Required. The starting row of the region.
    lastRow (Number): Required. The final row of the region.
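
    As a sketch, one can listen for the event and then trigger it by hand; the event payload shape (e.data.first / e.data.last) is assumed from common Ace usage rather than taken from this page:

        bgTokenizer.on("update", function(e) {
            // First and last rows of the updated region (assumed payload shape).
            console.log("rows updated:", e.data.first, "to", e.data.last);
        });

        // Signal that rows 0 through 10 should be treated as updated.
        bgTokenizer.fireUpdateEvent(0, 10);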

    • BackgroundTokenizer.getState(Number row)

    Returns the state of tokenization at the end of a row.

    Arguments

    row (Number): Required. The row to get state at.
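
    A brief sketch, reusing the bgTokenizer instance from the constructor example above:

        // Typically a string such as "start" (assumed; the exact states depend on the mode).
        var state = bgTokenizer.getState(5);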

    • BackgroundTokenizer.getTokens(Number row)

    Returns the list of tokens for the row. (Tokens are cached.)

    Arguments

    row (Number): Required. The row to get tokens at.
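
    For example (bgTokenizer as above):

        // Ace tokens are plain objects; `type` and `value` are their standard fields.
        var tokens = bgTokenizer.getTokens(0);
        tokens.forEach(function(token) {
            console.log(token.type, JSON.stringify(token.value));
        });
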
    • BackgroundTokenizer.setDocument(Document doc)

    Sets a new document to associate with this object.

    Arguments

    doc (Document): Required. The new document to associate with.
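
    A sketch of swapping in a fresh Document and re-running tokenization; the ace/document module path reflects typical Ace usage and is assumed here:

        var Document = require("ace/document").Document;

        bgTokenizer.setDocument(new Document("var x = 1;\n"));
        bgTokenizer.start(0); // re-tokenize the new document from the top
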
    • BackgroundTokenizer.setTokenizer(Tokenizer tokenizer)

    Sets a new tokenizer for this object.

    Arguments

    tokenizer (Tokenizer): Required. The new tokenizer to use.
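
    For instance, the tokenizer can be replaced when the document's language changes; the HTML mode path is an assumption for illustration:

        var HtmlMode = require("ace/mode/html").Mode;

        // Replace the tokenizer, e.g. after switching the document to another language.
        bgTokenizer.setTokenizer(new HtmlMode().getTokenizer());
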
    • BackgroundTokenizer.start(Number startRow)

    Starts tokenizing at the row indicated.

    Arguments

    startRow (Number): Required. The row to start at.
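
    For example (changedRow is a hypothetical placeholder for the first edited row):

        // Tokenize the whole document from the top.
        bgTokenizer.start(0);

        // After an edit, resuming at the first changed row is enough, since everything
        // below it is re-tokenized anyway.
        bgTokenizer.start(changedRow);
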
    • BackgroundTokenizer.stop()

    Stops tokenizing.
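
    For example:

        // Halt background tokenization, e.g. before tearing the session down.
        bgTokenizer.stop();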