Deployed on AWS
    Free Trial
    Supercharge your APIs beyond standard compression: cut your data transfer and processing fees by reducing data volume 116x on average, while speeding up transmission, without any refactoring or loss of information! Currently supports JSON.

    Overview

    Play video

    Pipeline-D™ is a revolutionary data reduction solution designed to reduce cloud data processing fees beyond what standard compression algorithms and standard binary wire serialization formats can achieve. Designed to pay for itself and more while speeding up your data transmission rates, Pipeline-D™ delivers lossless data reduction by a factor of 116x on average (validated using the openFDA dataset) with a minimal footprint on any given cloud architecture.

    Powered by the patent-pending technologies ProtoSlicer™ and BitAtomizer™, Pipeline-D™ compacts machine data in real time, eliminating data waste at its source to reduce cloud data transfer and processing costs, and restores the data at its destination only when needed. This is achieved by exposing a simple API in each of your cloud environments to encode and decode any number of JSON objects. As more data is processed within a session, overall reduction and efficiency improve.

    100% post-processing data integrity has been validated using multiple methodologies, including the full processing of 70+ public datasets, and the solution has been qualified by the Government of Canada for deployment on its infrastructure. Data security is ensured through a strong privacy-focused design, a solid core implementation built in Rust, and rigorous assessments by cybersecurity experts.

    Pipeline-D™ can also be used in conjunction with our proprietary encoding and decoding libraries, available natively for Windows, Linux, and web clients. Please contact sales@alpha-sanatorium.com for more information.

    Highlights

    • Pipeline-D™ compacts compatible machine data by 116x on average, achieving better results than any other solution on the market.
    • Pipeline-D™ is plug and play; it requires no programming or lengthy setup.
    • Pipeline-D™ is powered by the patent-pending technologies ProtoSlicer™ and BitAtomizer™, which split and compact your data optimally without the need for costly processing.

    Details

    Delivery method
    Container image

    Supported services
    Amazon ECS, Amazon EKS, Amazon ECS Anywhere, Amazon EKS Anywhere
    Delivery option
    Standard

    Latest version

    Operating system
    Linux

    Features and programs

    Financing for AWS Marketplace purchases

    AWS Marketplace now accepts line of credit payments through the PNC Vendor Finance program. This program is available to select AWS customers in the US, excluding NV, NC, ND, TN, & VT.

    Pricing

    Free trial

    Try this product free for 31 days according to the free trial terms set by the vendor. Usage-based pricing is in effect for usage beyond the free trial terms. Your free trial gets automatically converted to a paid subscription when the trial ends, but may be canceled any time before that.
    Pricing is based on actual usage, with charges varying according to how much you consume. Subscriptions have no end date and may be canceled any time.
    Additional AWS infrastructure costs may apply. Use the AWS Pricing Calculator to estimate your infrastructure costs.

    Usage costs (1)

    Dimension: Hours
    Description: Container Hours
    Cost/unit/hour: $0.10
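
    As a rough illustration of the metered dimension above (the usage pattern is an assumption, not vendor guidance), one encoder/decoder pair kept running for a 30-day month works out as follows:

        # Hypothetical estimate: 2 containers (encoder + decoder) x 720 hours x $0.10 per container hour
        echo "2 * 720 * 0.10" | bc    # prints 144.00, i.e. about $144/month before AWS infrastructure costs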

    Vendor refund policy

    All fees are non-cancellable and non-refundable except as required by law.

    Legal

    Vendor terms and conditions

    Upon subscribing to this product, you must acknowledge and agree to the terms and conditions outlined in the vendor's End User License Agreement (EULA).

    Content disclaimer

    Vendors are responsible for their product descriptions and other product content. AWS does not warrant that vendors' product descriptions or other product content are accurate, complete, reliable, current, or error-free.

    Usage information

    Delivery details

    Standard

    Supported services:
    • Amazon ECS
    • Amazon EKS
    • Amazon ECS Anywhere
    • Amazon EKS Anywhere
    Container image

    Containers are lightweight, portable execution environments that wrap server application software in a filesystem that includes everything it needs to run. Container applications run on supported container runtimes and orchestration services, such as Amazon Elastic Container Service (Amazon ECS) or Amazon Elastic Kubernetes Service (Amazon EKS). Both eliminate the need for you to install and operate your own container orchestration software by managing and scheduling containers on a scalable cluster of virtual machines.

    Version release notes

    Initial general public version.

    Additional details

    Usage instructions

    Pipeline-D Setup

    • After AWS grants you access to the private container image, launch the container with docker run <AWS Container Image URI>
    • Deploy on two or more Linux containers in separate locations: one for the sender, one for the recipient (see the launch sketch below)
    • Use an encoder on the sender side and a decoder on the recipient side
    • For parallel sessions, deploy one container pair per session
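
    A minimal launch sketch, assuming plain Docker on each side, a placeholder image URI, and that the container serves its HTTP API on port 8080 as described under Usage Overview; on Amazon ECS or EKS the equivalent would be one task or pod per container:

        # Sender location: encoder container (the image URI placeholder is whatever AWS grants you access to)
        docker run -d --name pipeline-d-encoder -p 8080:8080 <AWS Container Image URI>

        # Recipient location: decoder container, deployed separately from the encoder
        docker run -d --name pipeline-d-decoder -p 8080:8080 <AWS Container Image URI>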

    Important Notes

    • Runs in a closed and isolated cloud environment
    • Handles one request at a time
    • Assumes input is valid (a pre-send validation sketch follows this list)
    • Invalid data may cause data leakage, corruption, denial of service, or remote code execution
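
    Since the service assumes its input is valid, it is worth checking payloads before sending them. A minimal pre-send check, assuming the jq command-line tool is available (jq is not part of the product):

        # jq exits non-zero if payload.json is not well-formed JSON (e.g. trailing commas or bare NaN/Infinity)
        jq empty payload.json || { echo "payload.json is not valid JSON; refusing to send" >&2; exit 1; }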

    Usage Overview

    • Listens on HTTP port 8080
    • Maintains a stateful session per sender and recipient
    • Each session builds a dictionary in memory to improve performance
    • Reset sessions to release memory when needed

    Message Flow

    • Sender resets encoder if needed
    • Sender sends message to encoder
    • Encoded message is sent to recipient
    • Recipient resets decoder if sender did
    • Recipient sends encoded message to decoder
    • Decoder sends the result to its final destination
    • Repeat the encode and decode steps above for each subsequent message in the session (a scripted sketch of one round trip follows)
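
    A scripted sketch of one round trip, assuming the encoder and decoder containers are reachable at the hypothetical hostnames encoder-host and decoder-host on port 8080, and that the endpoints are served at /reset, /compress, and /decompress as named in the API Reference below; the transport that carries the encoded bytes from sender to recipient is your own and is not shown:

        # Start a fresh session on both sides (sender resets the encoder, recipient resets the decoder)
        curl -s -X POST http://encoder-host:8080/reset
        curl -s -X POST http://decoder-host:8080/reset

        # Sender: compress the JSON message; the binary result is what travels to the recipient
        curl -s -X POST http://encoder-host:8080/compress \
             -H "Content-Type: application/json" \
             --data-binary @message.json -o message.bin

        # Recipient: decompress the received bytes back into the original JSON
        curl -s -X POST http://decoder-host:8080/decompress \
             -H "Content-Type: application/octet-stream" \
             --data-binary @message.bin -o message.restored.json

        # Repeat the compress/decompress calls, in order, for every further message in the session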

    API Reference

    • POST compress
      • Compresses JSON using internal methods
      • Input: Content-Type: application/json; body must be valid JSON
      • Output: Content-Type: application/octet-stream; body is the compressed data
      • Ensure the JSON is valid: no trailing commas, properly escaped strings, and no NaN or Infinity values
    • POST decompress
      • Restores compressed data to the original JSON
      • Input: Content-Type: application/octet-stream; body is the encoded data
      • Output: Content-Type: application/json; body is the restored JSON
      • Decode messages in the same order as they were encoded
      • Do not mix data from different sessions
    • POST reset / GET reset
      • Resets the session and clears memory
      • No input required
      • Returns HTTP 200 OK
      • Can be used to confirm the service is ready (see the probe example below)
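
    For example, a readiness probe using the GET variant, assuming the endpoint is served at /reset on a locally reachable container (the hostname and path are assumptions beyond the endpoint names above):

        # A 200 response means the service is up; note this also resets the session and clears its memory
        curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8080/reset    # prints 200 when ready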

    Support

    AWS infrastructure support

    AWS Support is a one-on-one, fast-response support channel that is staffed 24x7x365 with experienced and technical support engineers. The service helps customers of all sizes and technical abilities to successfully utilize the products and features provided by Amazon Web Services.

    Customer reviews

    No customer reviews yet.