
Overview
Text Comprehend is a Natural Language Understanding solution that helps users comprehend a passage of text. It is a state-of-the-art, context-aware factoid model with bi-directional attention for comprehension. A deep contextualized embedding is used for the distributed word representation. The output of the model is a sub-string of variable length taken from the context passage.
Highlights
- A deep learning based model with attention that extracts insights for factoid inputs with respect to the context passage. Contextual embeddings are used for the distributed representation of the passage. The input context passage can have a maximum length of 1024 words.
- Text Comprehend can be used in document search engines to improve search, in factoid text-based systems, and in building chatbots.
- Mphasis DeepInsights is a cloud-based cognitive computing platform that offers data extraction and predictive analytics capabilities. Need customized deep learning and machine learning solutions? Get in touch!
Details
Pricing
| Dimension | Description | Cost/host/hour |
|---|---|---|
| ml.m5.large Inference (Batch), Recommended | Model inference on the ml.m5.large instance type, batch mode | $20.00 |
| ml.c5.large Inference (Real-Time), Recommended | Model inference on the ml.c5.large instance type, real-time mode | $10.00 |
| ml.m4.4xlarge Inference (Batch) | Model inference on the ml.m4.4xlarge instance type, batch mode | $20.00 |
| ml.m5.4xlarge Inference (Batch) | Model inference on the ml.m5.4xlarge instance type, batch mode | $20.00 |
| ml.m4.16xlarge Inference (Batch) | Model inference on the ml.m4.16xlarge instance type, batch mode | $20.00 |
| ml.m5.2xlarge Inference (Batch) | Model inference on the ml.m5.2xlarge instance type, batch mode | $20.00 |
| ml.p3.16xlarge Inference (Batch) | Model inference on the ml.p3.16xlarge instance type, batch mode | $20.00 |
| ml.m4.2xlarge Inference (Batch) | Model inference on the ml.m4.2xlarge instance type, batch mode | $20.00 |
| ml.c5.2xlarge Inference (Batch) | Model inference on the ml.c5.2xlarge instance type, batch mode | $20.00 |
| ml.p3.2xlarge Inference (Batch) | Model inference on the ml.p3.2xlarge instance type, batch mode | $20.00 |
Vendor refund policy
Currently we do not support refunds, but you can cancel your subscription to the service at any time.
Legal
Vendor terms and conditions
Content disclaimer
Delivery details
Amazon SageMaker model
An Amazon SageMaker model package is a pre-trained machine learning model ready to use without additional training. Use the model package to create a model on Amazon SageMaker for real-time inference or batch processing. Amazon SageMaker is a fully managed platform for building, training, and deploying machine learning models at scale.
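As a concrete illustration of that workflow, here is a minimal sketch using the SageMaker Python SDK; it is an assumption-laden example rather than a definitive implementation. The model package ARN, IAM role, S3 bucket, and endpoint name are placeholders, not values supplied by this listing, and the instance types are the recommended ones from the pricing table above.

```python
# Minimal sketch (SageMaker Python SDK), assuming you have subscribed to this listing.
# The ARNs, role, bucket, and endpoint name below are placeholders.
import sagemaker
from sagemaker import ModelPackage

session = sagemaker.Session()
role = "arn:aws:iam::<account-id>:role/<sagemaker-execution-role>"               # placeholder
model_package_arn = "arn:aws:sagemaker:<region>:<account-id>:model-package/<id>"  # placeholder

model = ModelPackage(
    role=role,
    model_package_arn=model_package_arn,
    sagemaker_session=session,
)

# Real-time inference on the recommended ml.c5.large instance type.
model.deploy(
    initial_instance_count=1,
    instance_type="ml.c5.large",
    endpoint_name="text-comprehend-endpoint",  # placeholder name
)

# Batch processing on the recommended ml.m5.large instance type.
transformer = model.transformer(instance_count=1, instance_type="ml.m5.large")
transformer.transform(
    data="s3://<bucket>/Input.zip",  # placeholder S3 location of the zipped input
    content_type="application/zip",
)
```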
Version release notes
Bug fixes and performance improvements.
Additional details
Inputs
- Summary
  - The input has to be a '.zip' file named "Input.zip" which contains two text files:
    1. passage.txt – contains the passage, whose length should be between 100 and 1024 words.
    2. question.txt – contains the question, which should be at least 3 words long.
  - The text files should follow 'utf-8' encoding. A minimal sketch of preparing and submitting this input appears after the input data table below.
- Input MIME type
- application/zip, text/csv
Input data descriptions
The following table describes supported input data fields for real-time inference and batch transform.
| Field name | Description | Constraints | Required |
|---|---|---|---|
| passage.txt | Contains the passage, whose length should be between 100 and 1024 words. | Type: FreeText | Yes |
| question.txt | Contains the question, which should be at least 3 words long. | Type: FreeText | Yes |
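To make these constraints concrete, the following is a minimal sketch, assuming an endpoint named as in the deployment sketch earlier: it assembles the zipped input in memory and submits it for real-time inference with boto3. The passage and question strings are placeholders, and since this page does not document the exact response format, the body is simply decoded and printed. For batch transform, the same archive would instead be uploaded to S3 as Input.zip.

```python
# Minimal sketch (Python): build the zipped input and invoke a deployed endpoint.
import io
import zipfile
import boto3

# Placeholder inputs that satisfy the stated constraints.
passage = " ".join(["word"] * 150)       # passage: 100-1024 words
question = "What is the passage about?"  # question: at least 3 words

# Build the archive in memory with utf-8 encoded passage.txt and question.txt.
buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w", zipfile.ZIP_DEFLATED) as archive:
    archive.writestr("passage.txt", passage.encode("utf-8"))
    archive.writestr("question.txt", question.encode("utf-8"))
buffer.seek(0)

# Send the archive to the real-time endpoint.
runtime = boto3.client("sagemaker-runtime")
response = runtime.invoke_endpoint(
    EndpointName="text-comprehend-endpoint",  # placeholder endpoint name
    ContentType="application/zip",            # supported input MIME type
    Body=buffer.read(),
)

# The listing describes the output only as a sub-string of the passage,
# so the response body is just decoded and printed here.
print(response["Body"].read().decode("utf-8"))
```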
Resources
Vendor resources
Support
Vendor support
For any assistance, reach out to us at:
AWS infrastructure support
AWS Support is a one-on-one, fast-response support channel that is staffed 24x7x365 with experienced technical support engineers. The service helps customers of all sizes and technical abilities to successfully utilize the products and features provided by Amazon Web Services.
Customer reviews
An AI-powered tool: enhanced, decisive details from unstructured text
Upsides of using it are:
- Time efficiency
- Actionable insights
- Scalability
- Multilingual support
- Improved accuracy
- Integration friendly

The downsides of using it are:
- Customization limits
- Cost: I found it a bit expensive for processing large volumes of text, so it is not as cost-efficient as it should be
- Privacy concerns for sensitive data
These functions are particularly beneficial for the customer service, legal, and healthcare industries, in all of which I have good working experience.