
Overview
The Universal Sentence Encoder encodes text into high-dimensional vectors that can be used for text classification, semantic similarity, clustering and other natural language tasks.
The model is trained and optimized for greater-than-word length text, such as sentences, phrases or short paragraphs. It is trained on a variety of data sources and a variety of tasks with the aim of dynamically accommodating a wide variety of natural language understanding tasks. The input is variable-length text and the output is a 512-dimensional vector.
The model supports text in 16 languages (Arabic, Chinese-simplified, Chinese-traditional, English, French, German, Italian, Japanese, Korean, Dutch, Polish, Portuguese, Spanish, Thai, Turkish, Russian). The language of the text input does not need to be specified.
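As an illustration of how the 512-dimensional output vectors are typically used for semantic similarity, the sketch below scores two embeddings with cosine similarity. It assumes the embeddings have already been returned by the model; the vectors shown are placeholders, not real model output.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (1.0 = identical direction)."""
    a, b = np.asarray(a), np.asarray(b)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Placeholder 512-dimensional embeddings standing in for real model output,
# e.g. for "How old are you?" and its Spanish translation.
emb_en = np.random.rand(512)
emb_es = np.random.rand(512)

print(cosine_similarity(emb_en, emb_es))  # scores closer to 1.0 indicate higher similarity
```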
Highlights
- Covers 16 languages, showing strong performance on cross-lingual retrieval.
- The model is intended for text classification, text clustering, semantic textual similarity retrieval, cross-lingual text retrieval, and similar tasks.
- Developed by researchers at Google in 2019.
Details
Pricing
Free trial
| Dimension | Description | Cost/host/hour |
|---|---|---|
| ml.m5.xlarge Inference (Batch), Recommended | Model inference on the ml.m5.xlarge instance type, batch mode | $0.10 |
| ml.m5.xlarge Inference (Real-Time), Recommended | Model inference on the ml.m5.xlarge instance type, real-time mode | $0.10 |
| ml.m4.4xlarge Inference (Batch) | Model inference on the ml.m4.4xlarge instance type, batch mode | $0.10 |
| ml.m5.4xlarge Inference (Batch) | Model inference on the ml.m5.4xlarge instance type, batch mode | $0.10 |
| ml.m5.12xlarge Inference (Batch) | Model inference on the ml.m5.12xlarge instance type, batch mode | $0.10 |
| ml.m5.large Inference (Batch) | Model inference on the ml.m5.large instance type, batch mode | $0.10 |
| ml.m4.16xlarge Inference (Batch) | Model inference on the ml.m4.16xlarge instance type, batch mode | $0.10 |
| ml.m5.2xlarge Inference (Batch) | Model inference on the ml.m5.2xlarge instance type, batch mode | $0.10 |
| ml.m4.10xlarge Inference (Batch) | Model inference on the ml.m4.10xlarge instance type, batch mode | $0.10 |
| ml.m5.24xlarge Inference (Batch) | Model inference on the ml.m5.24xlarge instance type, batch mode | $0.10 |
Vendor refund policy
Thank you for purchasing the USE Embedding API on AWS Marketplace. We strive to ensure customer satisfaction with our services. If you have issues accessing the service, contact us at support@aumlabs.ai.
Legal
Vendor terms and conditions
Content disclaimer
Delivery details
Amazon SageMaker model
An Amazon SageMaker model package is a pre-trained machine learning model ready to use without additional training. Use the model package to create a model on Amazon SageMaker for real-time inference or batch processing. Amazon SageMaker is a fully managed platform for building, training, and deploying machine learning models at scale.
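A minimal deployment sketch using the SageMaker Python SDK is shown below. The model package ARN, IAM role, and endpoint name are placeholders; substitute the values from your AWS Marketplace subscription, and choose one of the instance types listed in the Pricing table.

```python
import sagemaker
from sagemaker import ModelPackage

session = sagemaker.Session()

# Placeholders: use your own execution role and the model package ARN from your subscription.
role = "arn:aws:iam::<account-id>:role/<sagemaker-execution-role>"
model_package_arn = "arn:aws:sagemaker:<region>:<account-id>:model-package/<use-embedding-package>"

# Create a SageMaker model from the pre-trained model package.
model = ModelPackage(
    role=role,
    model_package_arn=model_package_arn,
    sagemaker_session=session,
)

# Deploy a real-time inference endpoint on a supported instance type.
predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.xlarge",
    endpoint_name="use-embedding-endpoint",
)
```

The same model object can also be used for batch processing via a SageMaker batch transform job instead of a persistent endpoint.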
Version release notes
Additional details
Inputs
- Summary
This model accepts text passed as JSON in the request payload or as JSON text files stored in an Amazon S3 bucket.
- Limitations for input type
- Maximum payload size for endpoint invocation is 5 MB, while the maximum payload size for batch inference is 100 MB.
- Input MIME type
- application/json
Input data descriptions
The following table describes supported input data fields for real-time inference and batch transform.
| Field name | Description | Constraints | Required |
|---|---|---|---|
| text | An array of text. The text can be in any of the supported languages: Arabic, Chinese-simplified, Chinese-traditional, English, French, German, Italian, Japanese, Korean, Dutch, Polish, Portuguese, Spanish, Thai, Turkish, Russian. | Type: FreeText | Yes |
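As a sketch of a real-time invocation, the example below builds a JSON payload with a `text` array as described in the table above and sends it to a deployed endpoint. The endpoint name is a placeholder, and the exact shape of the returned embeddings is not specified in this listing.

```python
import json
import boto3

runtime = boto3.client("sagemaker-runtime")

# JSON payload with a "text" array, per the input description above.
payload = {
    "text": [
        "The quick brown fox jumps over the lazy dog.",
        "El rápido zorro marrón salta sobre el perro perezoso.",
    ]
}

response = runtime.invoke_endpoint(
    EndpointName="use-embedding-endpoint",   # placeholder endpoint name
    ContentType="application/json",          # supported input MIME type
    Body=json.dumps(payload),
)

result = json.loads(response["Body"].read())  # 512-dimensional vector(s) for the input text
```

For batch inference, the same application/json files can be staged in an Amazon S3 bucket and processed with a batch transform job, subject to the 100 MB payload limit noted above.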
Resources
Vendor resources
Support
Vendor support
AWS infrastructure support
AWS Support is a one-on-one, fast-response support channel that is staffed 24x7x365 with experienced and technical support engineers. The service helps customers of all sizes and technical abilities to successfully utilize the products and features provided by Amazon Web Services.
Customer reviews
Game Changer
AUM labs transformed our approach to AI deployment. The seamless integration and the ability to run inference APIs in our private cloud was a game changer for us and further helped us keep our data secure.