
Hugging Face Hub


Reviews from AWS customers

0 AWS reviews
  • 5 star: 0
  • 4 star: 0
  • 3 star: 0
  • 2 star: 0
  • 1 star: 0

External reviews

13 reviews
External reviews are not included in the AWS star rating for the product.


    KhasimMirza

Extensive documentation and diverse models support AI-driven projects

  • April 18, 2025
  • Review provided by PeerSpot

What is our primary use case?

I am working on AI with various large language models for different purposes such as medicine and law, where they are fine-tuned with specific requirements. I download LLMs from Hugging Face for these environments. I use it to support AI-driven projects and deploy AI applications for local use, focusing on local LLMs with real-world applications.

What is most valuable?

Hugging Face is valuable because it provides a single, comprehensive repository with thorough documentation and extensive datasets. It hosts nearly 400,000 open-source LLMs that cover a wide variety of tasks, including text classification, token classification, text generation, and more. It serves as a foundational platform offering updated resources, making it essential in the AI community.

What needs improvement?

It is challenging to suggest specific improvements for Hugging Face, as their platform is already very well-organized and efficient. However, they could focus on cleaning up outdated models if they seem unnecessary and continue organizing more LLMs.

For how long have I used the solution?

I have been working with Hugging Face for about one and a half years.

What do I think about the stability of the solution?

Hugging Face is stable, provided the environment is controlled, and the user base is limited. The stability relies on the specific models and the data they're fed, which minimizes issues like hallucination.

What do I think about the scalability of the solution?

Hugging Face is quite scalable, especially in terms of upgrading models for better performance. There is flexibility in using models of varying sizes while keeping the application environment consistent.

How are customer service and support?

I have not needed to communicate with Hugging Face's technical support because they have extensive documentation available.

How would you rate customer service and support?

Neutral

Which solution did I use previously and why did I switch?

Before Hugging Face, I used Ollama due to its ease of use, but Hugging Face offers a wider range of models.

How was the initial setup?

The initial setup can be rated as a seven out of ten due to occasional issues during model deployment, which might require adjustments. Recent developments have made the process easier though.

What's my experience with pricing, setup cost, and licensing?

The pricing is reasonable. I use a pro account, which costs about $9 a month. This positions it in the middle of the cost scale.

Which other solutions did I evaluate?

Before choosing Hugging Face, I used Ollama for its ease of use, but it lacked the variety offered by Hugging Face.

What other advice do I have?

Overall, the platform is excellent. For any AI enthusiast, Hugging Face provides a broad array of open-source models and a solid foundation for building AI applications. Using an on-premises model helps manage errors in critical environments. I rate Hugging Face as an eight out of ten.

Which deployment model are you using for this solution?

On-premises


    SwaminathanSubramanian

Versatility empowers AI concept development despite the multi-GPU challenge

  • February 05, 2025
  • Review provided by PeerSpot

What is our primary use case?

I have been using Hugging Face for proofs of concept (POCs) and a generative AI project. Currently, I'm trying to use it with Tala and Ollama, along with some other AI tools, as I build up my knowledge of AI and generative AI.

What is most valuable?

I like that Hugging Face is versatile in the way it has been developed. I appreciate the versatility and the fact that it has generalized many models. I'm exploring other solutions as well; however, I find Hugging Face very user-friendly.

I am still building my knowledge of it. From my perspective, it's very easy to use, and as you ramp up, you discover new aspects about it.

What needs improvement?

Regarding scalability, I'm finding the multi-GPU aspect of it challenging. Training the model is another hurdle, although I'm only getting into that aspect currently. Organizations are apprehensive about investing in multi-GPU setups. 

Additionally, data cleanup is a challenge that needs to be resolved, as data must be mature and pristine.

For how long have I used the solution?

I have been using it for a total of around six months.

What do I think about the stability of the solution?

I have not really faced any stability issues, however, the scale has been small. I'm unsure how it would perform on a larger scale.

What do I think about the scalability of the solution?

I have not had production-type deployments for a client yet. Organizations are not yet mature enough to invest significantly in multi-GPU setups, and they remain apprehensive about the multi-GPU route, which presents a scalability challenge.

How are customer service and support?

I have not contacted their support team yet.

How would you rate customer service and support?

Neutral

What's my experience with pricing, setup cost, and licensing?

I am just a user at this point and do not have information about their pricing.

Which other solutions did I evaluate?

I'm exploring LangChain and agentic AI as part of my current learning and development.

What other advice do I have?

Joining the Hugging Face community can provide additional support. It allows for collaboration on models and datasets, offering quick insights on how the community is using it. 

I rate the solution a seven out of ten.

If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?

Other


    Melek Ghouma

Accessible inference APIs drive personal project success for students

  • January 24, 2025
  • Review provided by PeerSpot

What is our primary use case?

This is a simple personal project, non-commercial. As a student, that's all I do.

What is most valuable?

The most valuable features are the inference APIs as it takes me a long time to run inferences on my local machine.
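The hosted inference APIs mentioned above can be reached over plain HTTP. Below is a minimal stdlib sketch of a call to the serverless Inference API; the endpoint URL pattern and payload shape follow Hugging Face's documented convention, but the model id and token here are placeholders, and the actual network call is left commented out:

```python
import json
from urllib import request

API_URL = "https://api-inference.huggingface.co/models/{model_id}"

def build_request(model_id: str, prompt: str, token: str) -> request.Request:
    """Assemble a serverless Inference API call (no network I/O happens here)."""
    body = json.dumps({"inputs": prompt,
                       "parameters": {"max_new_tokens": 64}}).encode()
    return request.Request(
        API_URL.format(model_id=model_id),
        data=body,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )

req = build_request("gpt2", "Hello", "hf_xxx")  # placeholder token
# with request.urlopen(req) as resp:            # real call needs a valid token
#     print(json.load(resp))
```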

What needs improvement?

Access to the models and datasets could be improved. Many interesting ones are restricted. It would be great if they provided access for students or non-professionals who just want to test things.

For how long have I used the solution?

I have been using this solution for about the last three or four months.

Which solution did I use previously and why did I switch?

I have used just TensorFlow and PyTorch. Nothing else.

What other advice do I have?

I've been trying to implement some chatbots, and having free access to Hugging Face helped me a lot. 

I use PyTorch and TensorFlow to implement other deep-learning models and access LLMs. Each of these tools has its own purpose. PyTorch is used for deep learning projects to train and fine-tune models at the deep learning level, while Hugging Face is mainly for the transformers library and LLM APIs. I cannot compare them directly. For me, it's about access to datasets and models.

I would rate this product nine out of ten. 


    Vikas_Gupta

Easy to use, but initial configuration can be a bit challenging

  • September 04, 2024
  • Review provided by PeerSpot

What is our primary use case?

We use the tool to extract data from a PDF file, give the text data to any Hugging Face model, such as Meta's Llama, and get results from those models according to the prompt. It's basically like having a chat with the PDF file.
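A chat-with-PDF flow like this typically extracts the text, splits it into overlapping chunks that fit the model's context window, and prompts the model with the relevant chunk. Below is a minimal stdlib sketch of the chunking and prompting steps; the extraction and model call are only indicated in comments, and pypdf is just one common choice for extraction:

```python
def chunk_text(text: str, max_chars: int = 2000, overlap: int = 200) -> list[str]:
    """Split extracted PDF text into overlapping windows that fit a model's context."""
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + max_chars])
        if start + max_chars >= len(text):
            break
        start += max_chars - overlap
    return chunks

def build_chat_prompt(chunk: str, question: str) -> str:
    """Combine a document chunk and a user question into a single prompt."""
    return f"Context from the PDF:\n{chunk}\n\nQuestion: {question}\nAnswer:"

# Upstream, the raw text would come from something like:
#   from pypdf import PdfReader
#   text = "\n".join(page.extract_text() for page in PdfReader("doc.pdf").pages)
# and the prompt would then go to a hosted or local Hugging Face model.
```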

What is most valuable?

The solution is easy to use compared to other frameworks like PyTorch and TensorFlow.

What needs improvement?

Initially, I faced issues with the solution's configuration.

For how long have I used the solution?

I have been using Hugging Face for almost two years.

What do I think about the stability of the solution?

Hugging Face is a stable solution.

What do I think about the scalability of the solution?

Hugging Face is a scalable solution.

What other advice do I have?

To use Hugging Face, you need basic knowledge of how to feed the data, how to split the data, how to train the model, and how to evaluate the model. Compared to other frameworks like PyTorch and TensorFlow, I'm more comfortable using Hugging Face. I would recommend the solution to other users.

Overall, I rate the solution seven and a half out of ten.


    Devendra (Dev) Mandloi

Open-source, reliable, and easy to learn

  • August 08, 2024
  • Review provided by PeerSpot

What is our primary use case?

I had to perform training on a model when I worked as a data scientist. There is already a pre-trained model, and we train it further on our custom data. We can build on a pre-trained model that has already been trained on a huge amount of data.

What is most valuable?

Hugging Face provides open-source models, making it the best open-source and reliable solution. Currently, Hugging Face is the best solution for exploring many models. There are several models that we can use in real life. For example, we can use a Hugging Face NER model to extract only specific entities from a text.
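The NER use case can be sketched like this. The runnable helper filters an entity list shaped like the output of transformers' aggregated NER pipeline; the pipeline call and checkpoint name in the comments are assumptions, and the sample entities are made up for illustration:

```python
def filter_entities(entities: list[dict], wanted: set[str]) -> list[str]:
    """Keep only the entity types of interest from an NER result."""
    return [e["word"] for e in entities if e["entity_group"] in wanted]

# Real entities would come from something like (model download required):
#   from transformers import pipeline
#   ner = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")
#   entities = ner("Hugging Face is based in New York City.")

# Made-up sample in the same shape as the aggregated pipeline output:
sample = [{"word": "Hugging Face", "entity_group": "ORG"},
          {"word": "New York City", "entity_group": "LOC"}]
```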

What needs improvement?

Most people upload their pre-trained models on Hugging Face, but more details should be added about the models.

For how long have I used the solution?

I have been using Hugging Face for six months.

What do I think about the stability of the solution?

The solution provides good stability.

What do I think about the scalability of the solution?

Five people from our team totally depend on the Hugging Face model whenever the company gets a new project.

What's my experience with pricing, setup cost, and licensing?

Hugging Face is an open-source solution.

What other advice do I have?

The solution is deployed on the cloud in our organization. Hugging Face provides many open-source models, like Llama and Gemma, that are performing very well. When someone puts their model on Hugging Face, they provide all the steps. We can follow those steps and train our model. This is the best thing I have seen from Hugging Face.

Several IT industries in India are unable to purchase models like ChatGPT. Hugging Face provides open-source models, making it the best open-source and reliable solution. I would recommend the solution to other users. Users can easily use Hugging Face after watching YouTube videos on how to use it. It is easy to learn to use Hugging Face.

Overall, I rate the solution an eight out of ten.


    Mustafa Kurt

Available at a low cost

  • August 01, 2024
  • Review provided by PeerSpot

What is our primary use case?

We use the solution for fine-tuning, RAG, and LLMs.

What is most valuable?

The most important feature is secure LLM hosting, because there are so many LLMs available on the Internet.

What needs improvement?

It can incorporate AI into its services.

For how long have I used the solution?

I have been using Hugging Face for six months.

What do I think about the stability of the solution?

It is stable.

What do I think about the scalability of the solution?

It is scalable.

How was the initial setup?

Deployment can be challenging, but it becomes more manageable with the right education or by watching a tutorial. Many data science students might find it difficult to use. They need to learn about LLMs.

Since we have learned, we can use it easily. It takes two to three hours to deploy.

What's my experience with pricing, setup cost, and licensing?

It has reasonable pricing, which is six dollars per month.

What other advice do I have?

Integration is very easy.

Overall, I rate the solution an eight out of ten.


    AshishKumar11

Open-sourced, reliable, and enables organizations to finetune data for business requirements

  • July 25, 2024
  • Review provided by PeerSpot

What is our primary use case?

Hugging Face is a website that provides various open-source models. We use them to fine-tune models for our business. It is just like ChatGPT, except that ChatGPT is paid: if we call its API, we must pay for it. Hugging Face, however, hosts open-source models like Llama 2 and Llama 3 that provide similar functionality to ChatGPT. We use Llama 2 with 7 billion parameters to fine-tune on data for our business.

What is most valuable?

The tool is available for free. We use the product because it is beneficial for the company. It reduces cost. The product is reliable.

What needs improvement?

The solution must provide an efficient LLM. Facebook provides Llama 3, which gives results similar to ChatGPT. For now, Facebook is ChatGPT’s only competition. Hugging Face must provide a similar product.

For how long have I used the solution?

I have been using the solution for two to three months.

What do I think about the stability of the solution?

Facebook provides Llama 3; Hugging Face is just a pathway. We have not found any bugs in the last two months.

What do I think about the scalability of the solution?

Five AI engineers in our organization were using the solution.

How was the initial setup?

The installation is easy if the computer or laptop has good hardware, RAM, and NVIDIA graphics card. If a system has a low RAM, the installation will be difficult.

What's my experience with pricing, setup cost, and licensing?

We do not have to pay for the product.

Which other solutions did I evaluate?

Various closed-source models like ChatGPT charge us for every call we make. For example, if I make a call in ChatGPT, it will cost us $20. Hugging Face is an open-source model. It doesn’t charge anything. ChatGPT has better functionalities than other open-source tools. However, I think open-source products will increase their functionalities in the future and compete with OpenAI.

What other advice do I have?

I will recommend the solution to people. It is the only platform that provides open-source models. Once we understand the LLM, it will be easy to use the tool. The open-source community has limited resources. It is increasing, though.

Overall, I rate the solution a nine out of ten.


    Rohit Patel

An open-source platform that has hundreds of packages for creating LLMs

  • July 25, 2024
  • Review provided by PeerSpot

What is our primary use case?

In my last project, I created an SQL chatbot to convert simple English requests to complex SQL queries. As you know, computers don't understand textual data, so we have to tokenize it. I used Hugging Face embeddings for that.
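Embeddings map each sentence to a vector, and an SQL chatbot like this can match an English request to the closest schema description by cosine similarity. Here is a minimal plain-Python sketch of that comparison step; the commented-out encoder call and model id are illustrative assumptions:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Real vectors would come from a Hugging Face embedding model, e.g.:
#   from sentence_transformers import SentenceTransformer
#   model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
#   vec = model.encode("show all orders from last month").tolist()
```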

What is most valuable?

The tool's most valuable feature is that it's open-source and has hundreds of packages already available. This makes it quite helpful for creating our LLMs.

What needs improvement?

I've worked on three projects using Hugging Face, and only once did we encounter a problem with the code. We had to use another open-source embedding from OpenAI to resolve it. Our team has three members: me, my colleague, and a team leader. We looked at the problem and resolved it.

The solution offers numerous modules that can be loaded onto personal machines or local servers for use in Python or other programming environments. However, the instructions on how to use these modules are not detailed enough.

For how long have I used the solution?

I have been using the product for two months. 

How are customer service and support?

I haven't contacted the solution's support team yet. 

How was the initial setup?

You can download the packages and connect them to an external source. 

What's my experience with pricing, setup cost, and licensing?

The solution is open source. 

What other advice do I have?

I'm learning generative AI, and there's a course on the DeepLearning.AI platform for learning AI with Hugging Face. That's where I learned about Hugging Face. I found it very easy to load the Hugging Face packages to do our work, so I used it. Anyone with basic knowledge of coding can use it.

I rate the overall product an eight out of ten. 

Which deployment model are you using for this solution?

On-premises


    Neeraj Pokala

An open-source solution that helps to fine-tune large language models

  • July 24, 2024
  • Review provided by PeerSpot

What is our primary use case?

I use Hugging Face to fine-tune large language models. We take our client's use case and an open-source model already deployed, download the model artifacts, and fine-tune the models according to our specific use case.
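Pulling down model artifacts for local fine-tuning can be sketched as follows. The commented huggingface_hub calls reflect that library's documented API; the repo id is illustrative, and the runnable helper simply filters a repo file listing down to the artifacts worth downloading:

```python
def pick_weight_files(repo_files: list[str]) -> list[str]:
    """From a repo file listing, keep only weight and config artifacts."""
    keep = (".safetensors", ".bin", ".json")
    return [f for f in repo_files if f.endswith(keep)]

# With huggingface_hub installed, the listing and download would look like:
#   from huggingface_hub import list_repo_files, snapshot_download
#   files = list_repo_files("meta-llama/Llama-2-7b-hf")   # illustrative repo id
#   snapshot_download("meta-llama/Llama-2-7b-hf",
#                     allow_patterns=["*.safetensors", "*.json"])
```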

What is most valuable?

The tool's most valuable feature is that it shows trending models. All the new models, even Google's demo models, appear at the top. You can find all the open-source models in one place. You can use them directly and easily find their documentation. It's very simple to find documentation and write code. If you want to work with AI and machine learning, Hugging Face is a perfect place to start.

What needs improvement?

I believe Hugging Face has some room for improvement. There are some security issues. They provide code, but API tokens aren't indicated. Also, the documentation for particular models could use more explanation. But I think these things are improving daily. The main change I'd like to see is making the deployment of inference endpoints more customizable for users.

For how long have I used the solution?

I have been using the product for a year. 

What do I think about the stability of the solution?

I think Hugging Face is a good, stable product. I don't see any major bugs or breakdowns. The entire company is working to bring all open-source libraries onto one platform. Many companies use it to deploy their large language models for generative AI. It's a good platform, and I don't hear many complaints about it.

What do I think about the scalability of the solution?

I estimate that this product will have around 20,000 to 30,000 users. It is revolutionary.

How are customer service and support?

We contact support through emails. 

Which solution did I use previously and why did I switch?

We chose the solution because it helped us reduce costs. The same model would generate costs elsewhere. 

How was the initial setup?

We have two deployment options: cloud and on-premises. On-premises means it's on-demand, and we have to monitor it. With cloud deployment, there's no need to watch for availability because it's always handled in the cloud. There should be no problems with cloud deployment. If we deploy on-premises, we have to monitor it ourselves. That's the main difference. We have both options available.

It's very easy to deploy an endpoint because there's already pre-built documentation. With just one click, you can directly load the knowledge handler. The challenging part is determining if the model suits our customized use case, which takes time. Once we're sure the model is right for our use case, it's straightforward.

What's my experience with pricing, setup cost, and licensing?

The tool is open-source. The cost depends on the task. If you're using a large language model with around 12 billion parameters, it will cost more. In general, Hugging Face is open source, so you can download models to your local machine for free. For deployment, you can use any cloud service.

What other advice do I have?

You can start with it on a personal device. If you're planning to deploy, you might want to consider integrating Hugging Face with a cloud platform. This can help reduce charges, and the deployment will happen on the cloud platform.

If you're joining our team and using this tool for the first time, you'll need some experience deploying models. Hugging Face is one platform where you can deploy open-source models. You should have six or seven months of experience handling large language models. After that, you can learn the basic documentation in two or three days.

I rate it an eight out of ten. 


    Neeraj Maurya

Leveraging open models for tailored AI with room for easier model fine-tuning

  • May 28, 2024
  • Review provided by PeerSpot

What is our primary use case?

I use Hugging Face primarily to work with open LLM models. I recently started using them and also use embedding models. I use these models to train on custom data and to monitor our custom models after training and deployment.

What is most valuable?

The most valuable features of Hugging Face are the embedding models and the open LLM models. There are numerous libraries available, and the documentation is rich and step-by-step, helping us understand which model to use under particular conditions.

What needs improvement?

Hugging Face could improve by implementing a search engine or chatbot feature similar to ChatGPT. This would help developers easily find how to fine-tune models with specific data or get model recommendations for their data.

For how long have I used the solution?

I have been using Hugging Face for the last five years.

How are customer service and support?

One time, I submitted a support ticket concerning the fine-tuning of models. I was happy with the response to my query.

How would you rate customer service and support?

Neutral

How was the initial setup?

Initial setup can be challenging since it's not just dependent on Hugging Face but also on the overall architecture, whether you're using Kubernetes or Docker.

What's my experience with pricing, setup cost, and licensing?

If you are implementing for product services, it is a bit costly, especially when using open LLM models due to high machine and GPU requirements.

What other advice do I have?

Hugging Face is suitable if you are serious about your product and want to keep your data private instead of using peer services. It's good for learning and exploring AI models.

Which deployment model are you using for this solution?

On-premises