HTML API reference

This package is an HTML5 lexer written in Go. It follows the "The HTML syntax" section of the WHATWG HTML specification. The lexer takes an io.Reader and converts it into tokens until EOF is reached.

Installation

Run the following command

go get -u github.com/tdewolff/parse/v2/html

or add the following import and run your project with go get:

import "github.com/tdewolff/parse/v2/html"

Lexer

Usage

The following initializes a new Lexer with io.Reader r:

l := html.NewLexer(parse.NewInput(r))
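
If the input is already in memory, the parse package also has in-memory input constructors; a minimal sketch, assuming parse.NewInputString from github.com/tdewolff/parse/v2 behaves like parse.NewInput:

l := html.NewLexer(parse.NewInputString(`<p class="a">Hello</p>`))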

To tokenize until EOF or until an error occurs, use:

for {
	tt, data := l.Next()
	switch tt {
	case html.ErrorToken:
		// error or EOF set in l.Err()
		return
	case html.StartTagToken:
		// ...
		for {
			ttAttr, dataAttr := l.Next()
			if ttAttr != html.AttributeToken {
				break
			}
			// ...
		}
	// ...
	}
}
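
As the loop above shows, attribute tokens follow their StartTagToken until a non-attribute token appears, so they can be drained into a helper. A hedged sketch of such a helper (collectAttrs is hypothetical, not part of the package); note that AttrVal may return the value with its surrounding quotes still attached:

// collectAttrs reads attribute tokens emitted after a StartTagToken and
// returns them as a map, together with the first non-attribute token that
// ended the sequence (typically StartTagCloseToken or StartTagVoidToken).
func collectAttrs(l *html.Lexer) (map[string]string, html.TokenType, []byte) {
	attrs := map[string]string{}
	for {
		tt, data := l.Next()
		if tt != html.AttributeToken {
			return attrs, tt, data
		}
		// string(...) copies the bytes, so the map stays valid after
		// the lexer advances past this token.
		attrs[string(data)] = string(l.AttrVal())
	}
}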

All tokens:

ErrorToken TokenType = iota // extra token when errors occur
CommentToken
DoctypeToken
StartTagToken
StartTagCloseToken
StartTagVoidToken
EndTagToken
AttributeToken
TextToken
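
To see how these token types map onto real markup, the sketch below (an illustration, not part of the package) lexes a small in-memory document and prints each token type with its raw bytes; it assumes parse.NewInputString and that TokenType prints readably via %v:

package main

import (
	"fmt"
	"io"

	"github.com/tdewolff/parse/v2"
	"github.com/tdewolff/parse/v2/html"
)

func main() {
	l := html.NewLexer(parse.NewInputString(`<!doctype html><p class="x">Hi<br/></p><!--done-->`))
	for {
		tt, data := l.Next()
		if tt == html.ErrorToken {
			// io.EOF marks the normal end of input; anything else is a real error.
			if l.Err() != io.EOF {
				fmt.Println("error:", l.Err())
			}
			return
		}
		fmt.Printf("%v: %q\n", tt, data)
	}
}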

Examples

package main

import (
	"fmt"
	"io"
	"os"

	"github.com/tdewolff/parse/v2"
	"github.com/tdewolff/parse/v2/html"
)

// Tokenize HTML from stdin.
func main() {
	l := html.NewLexer(parse.NewInput(os.Stdin))
	for {
		tt, data := l.Next()
		switch tt {
		case html.ErrorToken:
			if l.Err() != io.EOF {
				fmt.Println("Error on line", l.Line(), ":", l.Err())
			}
			return
		case html.StartTagToken:
			fmt.Println("Tag", string(data))
			for {
				ttAttr, dataAttr := l.Next()
				if ttAttr != html.AttributeToken {
					break
				}

				key := dataAttr
				val := l.AttrVal()
				fmt.Println("Attribute", string(key), "=", string(val))
			}
		// ...
		}
	}
}
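
Since the example reads from os.Stdin, you can pipe an HTML document into the compiled program (or run it with go run) to inspect the token stream it produces.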

License

Released under the MIT license.