Google BERT for Patents
location: https://github.com/google/patents-public-data/blob/master/models/BERT%20for%20Patents.md
contributors: Google Patents, Rob Srebrovic, Jay Yonamine
tags: classification, novelty, machine learning
terms of use: http://www.apache.org/licenses/LICENSE-2.0
documentation: https://github.com/google/patents-public-data/blob/master/examples/BERT_For_Patents.ipynb
description: A BERT (Bidirectional Encoder Representations from Transformers) model pretrained on over 100 million patent publications from the U.S. and other countries using open-source tooling. The trained model supports a number of use cases, including more effective prior art searching to determine the novelty of a patent application, automatic generation of classification codes to assist with patent categorization, and autocompletion of patent text (a brief usage sketch follows below).
last edit: Fri, 01 Dec 2023 12:20:34 GMT
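
A minimal sketch of how embeddings from a BERT-for-Patents checkpoint could support prior art similarity search. The Hugging Face model identifier, the mean-pooling scheme, and the example texts below are assumptions for illustration only; the official weights are released as a TensorFlow checkpoint, and the linked notebook walks through the supported loading path.

```python
# Sketch: prior-art similarity via mean-pooled patent-text embeddings.
# Assumptions: a community Hugging Face port of the checkpoint is used;
# adjust MODEL_NAME and the loading step to the release you actually have.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "anferico/bert-for-patents"  # assumed community identifier

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

def embed(text: str) -> torch.Tensor:
    """Return a mean-pooled embedding for a patent text snippet."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state      # (1, seq_len, hidden)
    mask = inputs["attention_mask"].unsqueeze(-1)        # zero out padding
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)  # (1, hidden)

# Hypothetical query (e.g., a claim from an application) and candidate prior art.
query = "A lithium-ion battery electrode comprising a silicon nanowire anode."
candidate = "Anode containing silicon nanostructures for rechargeable cells."
similarity = torch.cosine_similarity(embed(query), embed(candidate))
print(f"cosine similarity: {similarity.item():.3f}")
```

The same embedding function can feed a nearest-neighbor index over a patent corpus; classification and autocomplete use cases instead rely on the model's output head and masked-token predictions, as shown in the notebook linked under documentation.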