Overview
Pipeline-D™ is a revolutionary data reduction solution designed to cut cloud data processing fees beyond what standard compression algorithms and standard binary wire serialization formats can achieve. Designed to pay for itself and more while speeding up your data transmission rates, Pipeline-D™ delivers lossless data reduction of 116x on average (validated using the openFDA dataset) with a minimal footprint on any given cloud architecture.
Powered by the patent-pending technologies ProtoSlicer™ and BitAtomizer™, Pipeline-D™ compacts machine data in real time, eliminating data waste at its source to reduce cloud data transfer and processing costs, and restores the data at its destination only when needed. It does this by exposing a simple API in each of your cloud environments to encode and decode any number of JSON objects. As more data is processed within a session, overall reduction and efficiency improve.
100% post-processing data integrity has been validated using multiple methodologies, including the full processing of 70+ public datasets, and the solution has been qualified by the Government of Canada for deployment on its infrastructure. Data security is guaranteed through a strong privacy-focused design, a solid core implementation in Rust, and rigorous assessments by cybersecurity experts.
Pipeline-D™ can also be used in conjunction with our proprietary encoding and decoding libraries, available natively for Windows, Linux, and web clients. Please contact sales@alpha-sanatorium.com for more information.
Highlights
- Pipeline-D™ compacts compatible machine data by 116x on average, achieving better results than any other solution on the market.
- Pipeline-D™ is plug and play; it requires no programming or lengthy setup.
- Pipeline-D™ is powered by the patent-pending technologies ProtoSlicer™ and BitAtomizer™, which split and compact your data optimally without the need for costly processing.
Details
Pricing
Free trial
| Dimension | Description | Cost/unit/hour |
| --- | --- | --- |
| Hours | Container Hours | $0.10 |
Vendor refund policy
All fees are non-cancellable and non-refundable except as required by law.
Legal
Vendor terms and conditions
Content disclaimer
Delivery details
Standard
- Amazon ECS
- Amazon EKS
- Amazon ECS Anywhere
- Amazon EKS Anywhere
Container image
Containers are lightweight, portable execution environments that wrap server application software in a filesystem that includes everything it needs to run. Container applications run on supported container runtimes and orchestration services, such as Amazon Elastic Container Service (Amazon ECS) or Amazon Elastic Kubernetes Service (Amazon EKS). Both eliminate the need for you to install and operate your own container orchestration software by managing and scheduling containers on a scalable cluster of virtual machines.
Version release notes
Initial public release.
Additional details
Usage instructions
Pipeline-D Setup
- After AWS grants you access to the private container image, launch the container with docker run <AWS Container Image URI>
- Deploy on two or more Linux containers in separate locations: one for the sender, one for the recipient
- Use an encoder on the sender side and a decoder on the recipient side
- For parallel sessions, deploy one container pair per session
Important Notes
- Runs in a closed and isolated cloud environment
- Handles one request at a time
- Assumes input is valid
- Invalid data may cause data leakage, corruption, denial of service, or remote code execution
Usage Overview
- Listens on HTTP port 8080
- Maintains a stateful session per sender and recipient
- Each session builds a dictionary in memory that improves compression as more data is processed
- Reset sessions to release memory when needed
Message Flow
1. The sender resets the encoder if needed
2. The sender sends the message to the encoder
3. The encoded message is sent to the recipient
4. The recipient resets the decoder if the sender reset the encoder
5. The recipient sends the encoded message to the decoder
6. The decoder sends the result to its final destination
7. Repeat steps 1 through 6 for each subsequent message in the session (a Python sketch follows)
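A minimal Python sketch of this flow, assuming an encoder/decoder container pair reachable at the hypothetical URLs below and exposing the /compress, /decompress, and /reset endpoints described in the API reference; adapt the names to your actual deployment:

```python
import requests

# Hypothetical addresses of a deployed encoder/decoder container pair.
ENCODER = "http://encoder.internal:8080"
DECODER = "http://decoder.internal:8080"

def start_session() -> None:
    # Steps 1 and 4: reset both sides so their session dictionaries match.
    requests.post(f"{ENCODER}/reset").raise_for_status()
    requests.post(f"{DECODER}/reset").raise_for_status()

def send_message(json_text: str) -> str:
    # Step 2: the sender submits one JSON message to its encoder.
    encoded = requests.post(
        f"{ENCODER}/compress",
        data=json_text.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    encoded.raise_for_status()

    # Steps 3 and 5: the encoded bytes travel to the recipient, which hands
    # them to its decoder in the same order they were produced.
    decoded = requests.post(
        f"{DECODER}/decompress",
        data=encoded.content,
        headers={"Content-Type": "application/octet-stream"},
    )
    decoded.raise_for_status()

    # Step 6: the restored JSON is forwarded to its final destination.
    return decoded.text

start_session()
for msg in ('{"sensor": 7, "value": 42}', '{"sensor": 7, "value": 43}'):
    print(send_message(msg))
```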
API Reference
- POST /compress
  - Compresses JSON using internal methods
  - Input
    - Content-Type: application/json
    - Body: valid JSON
  - Output
    - Content-Type: application/octet-stream
    - Body: compressed data
  - Ensure the JSON is valid:
    - No trailing commas
    - Strings must be properly escaped
    - Do not include NaN or Infinity values (see the sketch below)
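Python's standard json module can enforce the last of these constraints before the request is sent; a minimal sketch, assuming a container listening locally on port 8080:

```python
import json
import requests

payload = {"device": "pump-7", "reading": 18.4}

# allow_nan=False makes json.dumps raise ValueError on NaN or Infinity
# instead of emitting tokens that are not valid JSON.
body = json.dumps(payload, allow_nan=False)

resp = requests.post(
    "http://localhost:8080/compress",  # assumed local deployment
    data=body.encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
resp.raise_for_status()
compressed = resp.content  # opaque application/octet-stream bytes
```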
- POST /decompress
  - Restores compressed data to the original JSON
  - Input
    - Content-Type: application/octet-stream
    - Body: encoded data
  - Output
    - Content-Type: application/json
    - Body: the restored JSON
  - Decode messages in the same order they were encoded
  - Do not mix data from different sessions
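Continuing the compress sketch above, the recipient side hands those bytes, in the order they were encoded, to the paired decoder; the URL is again an assumed local deployment:

```python
import requests

# 'compressed' holds the bytes produced by the paired encoder in the
# compress sketch above, delivered in the same order they were encoded.
resp = requests.post(
    "http://localhost:8080/decompress",  # assumed local deployment
    data=compressed,
    headers={"Content-Type": "application/octet-stream"},
)
resp.raise_for_status()
original = resp.json()  # the restored JSON document
```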
- POST /reset
- GET /reset
  - Resets the session and clears its memory
  - No input required
  - Returns HTTP 200 OK
  - Can be used to confirm the service is ready
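Because the reset endpoints take no input and return HTTP 200 OK, GET /reset can double as a readiness probe; a sketch (note that it also clears session state, so call it only between sessions):

```python
import time
import requests

def wait_until_ready(base_url: str, attempts: int = 30) -> None:
    """Poll GET /reset until the service answers 200 OK."""
    for _ in range(attempts):
        try:
            if requests.get(f"{base_url}/reset", timeout=2).status_code == 200:
                return
        except requests.ConnectionError:
            pass  # container not accepting connections yet
        time.sleep(1)
    raise RuntimeError(f"Pipeline-D at {base_url} did not become ready")

wait_until_ready("http://localhost:8080")  # assumed local deployment
```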
Resources
Vendor resources
Support
Vendor support
AWS infrastructure support
AWS Support is a one-on-one, fast-response support channel that is staffed 24x7x365 with experienced and technical support engineers. The service helps customers of all sizes and technical abilities to successfully utilize the products and features provided by Amazon Web Services.