tscholak/1zha5ono Fine-tuned weights for PICARD: Parsing Incrementally for Constrained Auto-Regressive Decoding from Language Models, based on t5.1.1.lm100k.base. Training Data The model has been fine t…
This model belongs to the Styleformer project. Please refer to the GitHub page.
Ælæctra: A Step Towards More Efficient Danish Natural Language Processing. Ælæctra is a Danish Transformer-based language model created to enhance the variety of Danish NLP resources with a more efficie…
Model Card for Model ID. Model Details. Model Description. This is the model card of a 🤗 transfor…
ONNX port of sentence-transformers/all-MiniLM-L6-v2, adjusted to return attention weights. This model is intended to be used for BM42 searches. Usage Here's an example of performing inference using the…
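Since the card's own example is truncated above, here is a minimal sketch of BM42-style sparse inference with FastEmbed; the `Qdrant/bm42-all-minilm-l6-v2-attentions` repo id and a fastembed release that ships `SparseTextEmbedding` are assumptions:

```python
from fastembed import SparseTextEmbedding

# Assumed repo id for the BM42 attention model; substitute this model's actual id.
model = SparseTextEmbedding(model_name="Qdrant/bm42-all-minilm-l6-v2-attentions")

documents = [
    "FastEmbed computes sparse BM42 weights from attention scores.",
    "Sparse vectors pair token ids with importance values.",
]

# embed() yields one SparseEmbedding per document, with .indices and .values arrays.
for emb in model.embed(documents):
    print(emb.indices[:5], emb.values[:5])
```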
ONNX port of sentence-transformers/all-MiniLM-L6-v2 for text classification and similarity searches. Usage Here's an example of performing inference using the model with FastEmbed.
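As a sketch of the dense-similarity workflow, assuming a fastembed version that resolves this checkpoint by its sentence-transformers name:

```python
import numpy as np
from fastembed import TextEmbedding

model = TextEmbedding(model_name="sentence-transformers/all-MiniLM-L6-v2")

docs = ["The cat sits on the mat.", "A feline rests on a rug."]
# embed() returns a generator of 384-dimensional numpy vectors, one per document.
vec_a, vec_b = list(model.embed(docs))

# Cosine similarity between the two embeddings.
cos = np.dot(vec_a, vec_b) / (np.linalg.norm(vec_a) * np.linalg.norm(vec_b))
print(f"similarity: {cos:.3f}")
```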
doc2query/all-with_prefix-t5-base-v1 This is a doc2query model based on T5 (also known as docT5query). It can be used for: Document expansion: you generate 20-40 queries for your paragraphs and index…
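A hedged sketch of document expansion with this checkpoint via 🤗 Transformers; the "text2query:" task prefix is an assumption based on the model's "with_prefix" naming:

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

name = "doc2query/all-with_prefix-t5-base-v1"
tokenizer = T5Tokenizer.from_pretrained(name)
model = T5ForConditionalGeneration.from_pretrained(name)

# Assumed prefix; the full card lists the task prefixes this variant supports.
paragraph = "text2query: Python is a high-level, general-purpose programming language."
input_ids = tokenizer.encode(paragraph, return_tensors="pt")

# Sample several queries per paragraph (the card suggests 20-40 in production).
outputs = model.generate(
    input_ids, max_length=64, do_sample=True, top_p=0.95, num_return_sequences=5
)
for ids in outputs:
    print(tokenizer.decode(ids, skip_special_tokens=True))
```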
BART (large-sized model), fine-tuned on Amazon Reviews (English). The BART model was pre-trained on the CNN/DailyMail dataset, then re-trained on Amazon website purchase reviews that were…
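A minimal summarization sketch with the 🤗 pipeline API; the checkpoint id below is a placeholder, since this excerpt does not show the card's repo id:

```python
from transformers import pipeline

# Placeholder id; replace with this model's actual repo id.
summarizer = pipeline("summarization", model="<repo-id>/bart-large-amazon-reviews")

review = (
    "I bought these headphones last month. The battery easily lasts a full "
    "workday and noise cancellation is solid, but the ear cups get warm."
)
print(summarizer(review, max_length=60, min_length=10)[0]["summary_text"])
```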
AMRBART is a pretrained semantic parser which converts a sentence into an abstract meaning graph. You may find our paper here (arXiv). The original implementation is available here. News 🎈 (2022/12/10…
Model Card for ance-msmarco-passage Pyserini is a Python toolkit for reproducible information retrieval research with sparse and dense representations. Model Details Model Description Pyserini is prim…
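A sketch of dense retrieval with this encoder through Pyserini; the prebuilt index name follows Pyserini's documentation at the time of writing, but index names vary across releases, so treat both identifiers as assumptions:

```python
from pyserini.search.faiss import FaissSearcher

# Prebuilt index and encoder names assumed from Pyserini's docs.
searcher = FaissSearcher.from_prebuilt_index(
    "msmarco-passage-ance-bf",
    "castorini/ance-msmarco-passage",
)

hits = searcher.search("what is a lobster roll?")
for i, hit in enumerate(hits[:5], start=1):
    print(f"{i:2} {hit.docid:10} {hit.score:.5f}")
```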
Version: 2. Release Date: April 23, 2024. Intended Use This model is designed for the specific task of question answering (Q&A) in Italian. It is intended for applications that require understanding and…
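A minimal extractive-QA sketch with the 🤗 pipeline API; the repo id is a placeholder since it is not shown in this excerpt:

```python
from transformers import pipeline

# Placeholder id; replace with this model's actual repo id.
qa = pipeline("question-answering", model="<repo-id>/italian-qa")

result = qa(
    question="Dove si trova il Colosseo?",  # "Where is the Colosseum?"
    context="Il Colosseo è un anfiteatro di epoca romana situato nel centro di Roma.",
)
print(result["answer"], round(result["score"], 3))
```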
AraELECTRA <img src="https://raw.githubusercontent.com/aub-mind/arabert/master/AraELECTRA.png" width="100" align="left"/> ELECTRA is a method for self-supervised language representation learning. It ca…
AraT5-base AraT5: Text-to-Text Transformers for Arabic Language Generation <img src="https://huggingface.co/UBC-NLP/AraT5-base/resolve/main/AraT5_CR_new.png" alt="AraT5" width="45%" height="35%" align…
AraT5v2-base-1024 What's new? More Data. AraT5v2 is trained on larger and more diverse Arabic data. Larger Sequence Length. We increase the sequence length from 512 to 1024 in this version. Faster Convergence.…
Article Summarizer The Article Summarizer is a fine-tuned version of the Facebook BART-Large-CNN model, specifically optimized for generating more detailed and informative news summaries. Unlike the b…
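A sketch of long-form summarization with the underlying BART API; the repo id is a placeholder, and the generation settings biased toward longer, more detailed output are assumptions, not the card's documented defaults:

```python
from transformers import BartForConditionalGeneration, BartTokenizer

# Placeholder id; the base checkpoint is facebook/bart-large-cnn.
name = "<repo-id>/article-summarizer"
tokenizer = BartTokenizer.from_pretrained(name)
model = BartForConditionalGeneration.from_pretrained(name)

article = "Full news article text goes here ..."
inputs = tokenizer(article, truncation=True, max_length=1024, return_tensors="pt")

# A higher min_length and length_penalty push the model toward detailed summaries.
summary_ids = model.generate(
    inputs["input_ids"], num_beams=4, min_length=80, max_length=256, length_penalty=2.0
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```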
Article Title Generator The model is based on the T5 language model and trained on a large collection of Medium articles. Usage Example code: License: MIT
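The example code is not shown in this excerpt, so here is a hedged sketch with a placeholder repo id; whether the checkpoint expects a task prefix (e.g. "summarize:") is not confirmed here:

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

# Placeholder id; replace with this model's actual repo id.
name = "<repo-id>/article-title-generator"
tokenizer = T5Tokenizer.from_pretrained(name)
model = T5ForConditionalGeneration.from_pretrained(name)

body = "Medium article body text goes here ..."
input_ids = tokenizer.encode(body, return_tensors="pt", truncation=True, max_length=512)

# Beam search with a short max_length keeps the output title-sized.
title_ids = model.generate(input_ids, num_beams=4, max_length=32)
print(tokenizer.decode(title_ids[0], skip_special_tokens=True))
```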
ATT&CK BERT: a Cybersecurity Language Model ATT&CK BERT is a cybersecurity domain-specific language model based on sentence-transformers. ATT&CK BERT maps sentences representing attack actions to a se…
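A sketch of comparing attack-action sentences with sentence-transformers; basel/ATTACK-BERT is the commonly cited repo id for this model, but treat it as an assumption:

```python
from sentence_transformers import SentenceTransformer, util

# Assumed repo id; replace with the card's actual checkpoint if it differs.
model = SentenceTransformer("basel/ATTACK-BERT")

sentences = [
    "The malware exfiltrates credentials over HTTPS.",
    "Adversary steals passwords and sends them to a C2 server.",
]
embeddings = model.encode(sentences, convert_to_tensor=True)

# Semantically similar attack actions should score close to 1.0.
print(util.cos_sim(embeddings[0], embeddings[1]).item())
```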
This model is used by the Python package audio-denoiser.
Autoformer Overview The Autoformer model was proposed in Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting by Haixu Wu, Jiehui Xu, Jianmin Wang and Mingshen…