
Google BERT for Patents


contributors: Google Patents, Rob Srebrovic, Jay Yonamine

tags: classification, novelty, machine learning

terms of use:


description: A BERT (Bidirectional Encoder Representations from Transformers) model pretrained on over 100 million patent publications from the U.S. and other countries using open-source tooling. The trained model supports a range of use cases, including prior art searching to more effectively assess the novelty of a patent application, automatic generation of classification codes to assist with patent categorization, and text autocompletion.

last edit: Fri, 01 Dec 2023 12:20:34 GMT