Google BERT for Patents
contributors: Google Patents, Rob Srebrovic, Jay Yonamine
tags: classification, novelty, machine learning
terms of use: http://www.apache.org/licenses/LICENSE-2.0
description: A BERT (Bidirectional Encoder Representations from Transformers) model pretrained on over 100 million patent publications from the U.S. and other countries using open-source tooling. The trained model supports a number of use cases, including more effective prior art searching to determine the novelty of a patent application, automatic generation of classification codes to assist with patent categorization, and autocomplete of patent text.
last edit: Fri, 01 Dec 2023 12:20:34 GMT
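As a rough illustration of the prior art search use case: a patent application and a candidate prior art document can each be encoded as a fixed-length embedding by the model, and their similarity scored with cosine similarity. The sketch below uses random placeholder vectors in place of real model embeddings (the 1024-dimension size is an assumption based on a BERT-Large-style architecture); in practice the vectors would come from the pretrained BERT for Patents model.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Placeholder vectors standing in for document embeddings produced by the
# BERT for Patents model; 1024 dims is an assumption, not confirmed above.
rng = np.random.default_rng(0)
application_emb = rng.normal(size=1024)
prior_art_emb = rng.normal(size=1024)

# Higher scores suggest the candidate document is closer to the application,
# making it a stronger prior art lead for a human examiner to review.
score = cosine_similarity(application_emb, prior_art_emb)
```

In a real pipeline, candidate documents would be ranked by this score and the top hits surfaced for examiner review.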