Wiro AI
  • Home
  • Dashboard
  • Explore
  • Pricing
  • Blog
  • Documentation
  • Sign In
  • Sign Up

Task History

  • Running
  • Models
  • Training

You don't have any tasks yet.

Go to Models
Back to Explore

Models

View All
Relevance
  • Relevance
  • Newest
  • Most Rated
  • Most Commented
  • Highest Rated

Wiro AI
  • Social Media & Viral
  • Ecommerce
  • Human Resources
  • Fast Inference

Video Generation
  • Text to Video
  • Image to Video
  • Speech to Video
  • Talking Head

Image Generation
  • Text to Image
  • Image to Image
  • Image Editing

Audio & Speech
  • Text to Speech
  • Speech to Text
  • Text to Music
  • Text to Song
  • Image to Song
  • Video to Music
  • Voice Clone
  • Realtime Conversation

3D Generation
  • 3D Generation

LLM & Chat (1)
  • Chat
  • LLM
  • Partner LLM
  • RAG
  • Reasoning

Partners
  • Wiro
  • OpenAI
  • Google
  • ByteDance
  • Runway
  • Elevenlabs
  • Black Forest Labs
  • Kling-AI
  • MiniMax
  • PixVerse

wiro/rag-chat-github (RAG)

Instantly retrieve and analyze content from any GitHub repository. Select your LLM model, extract relevant information from codebases or documentation, and generate context-aware responses with ease!

wiro/rag-chat-youtube (RAG)

Extract insights directly from YouTube videos by simply providing a URL. Choose your LLM model, access video transcripts or summaries, and create contextually rich conversations effortlessly!

wiro/rag-chat-website (RAG)

Instantly retrieve and analyze content from any website URL. Select your LLM model, fetch key information from the page, and generate context-aware responses with ease!

wiro/rag-chat (RAG)

Wiro LLM Conversational RAG Tool: a unified interface for seamless RAG-based conversations. Select your preferred LLM model, retrieve relevant data from external sources, and generate context-aware responses effortlessly!
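The workflow these RAG tools describe (fetch a source, retrieve the passages relevant to a question, then answer grounded in that context) can be sketched generically. The functions below are illustrative only and are not Wiro's API; they use simple keyword-overlap scoring in place of a real embedding-based retriever.

```python
# Minimal retrieval-augmented generation (RAG) sketch: split a source
# document into chunks, score each chunk against the question by keyword
# overlap, and assemble a context-grounded prompt for an LLM.
# Illustrative only; function names are hypothetical, not the Wiro API.

def chunk(text: str, size: int = 40) -> list[str]:
    """Split text into chunks of roughly `size` words each."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def score(question: str, passage: str) -> int:
    """Count question keywords that also appear in the passage."""
    q = {w.lower().strip(".,!?") for w in question.split()}
    p = {w.lower().strip(".,!?") for w in passage.split()}
    return len(q & p)

def retrieve(question: str, passages: list[str], k: int = 2) -> list[str]:
    """Return the k passages most relevant to the question."""
    return sorted(passages, key=lambda p: score(question, p), reverse=True)[:k]

def build_prompt(question: str, context: list[str]) -> str:
    """Assemble a prompt that grounds the model in the retrieved context."""
    joined = "\n---\n".join(context)
    return f"Answer using only this context:\n{joined}\n\nQuestion: {question}"

doc = ("Wiro hosts RAG tools. The GitHub tool indexes repositories. "
       "The YouTube tool reads transcripts. The website tool fetches pages.")
question = "Which tool reads transcripts?"
prompt = build_prompt(question, retrieve(question, chunk(doc, 8)))
print(prompt)
```

A production version would replace `score` with vector similarity over embeddings, but the retrieve-then-prompt shape stays the same regardless of which LLM is selected.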
openai/gpt-oss-20b (RAG)

OpenAI GPT-OSS-20B model.

Qwen/Qwen3-30B-A3B (RAG)

Qwen3-30B-A3B model.

Qwen/Qwen3-30B-A3B-Thinking-2507 (RAG)

Qwen3-30B-A3B-Thinking-2507 model.

Qwen/Qwen3-32B (RAG)

Qwen3-32B model.

Qwen/Qwen3-Coder-30B-A3B-Instruct (RAG)

Qwen3-Coder-30B-A3B-Instruct model.
Qwen/Qwen2.5-14B-Instruct (RAG)

Qwen2.5-14B-Instruct is a large language model by Alibaba’s Qwen team, optimized for instruction-following tasks with 14 billion parameters. It offers strong reasoning, multilingual capabilities, and efficient performance, making it suitable for chatbots, content creation, and various AI-driven applications.

Qwen/Qwen2.5-32B-Instruct (RAG)

Qwen2.5-32B-Instruct is a powerful large language model developed by Alibaba’s Qwen team, designed for instruction-following tasks with enhanced reasoning and natural language understanding capabilities. Optimized for efficiency and accuracy, it supports multi-turn conversations and complex queries, making it suitable for applications such as chatbots, content generation, and AI assistants.

deepseek-ai/DeepSeek-R1-Distill-Qwen-32B (RAG)

DeepSeek-R1-Distill-Qwen-32B is a distilled version of the DeepSeek-R1 model with 32 billion parameters, leveraging the Qwen architecture to deliver high-quality language understanding and generation. Through knowledge distillation, it retains the strengths of larger models while offering improved efficiency and reduced computational requirements. This model is ideal for large-scale AI applications that demand robust performance with optimized resource utilization.

deepseek-ai/DeepSeek-R1-Distill-Qwen-14B (RAG)

DeepSeek-R1-Distill-Qwen-14B is a distilled version of the DeepSeek-R1 model, featuring 14 billion parameters. Built on the Qwen architecture, it utilizes advanced knowledge distillation techniques to achieve a balance between high performance and computational efficiency. This model is well-suited for a wide range of natural language processing tasks, providing accurate and context-aware responses while optimizing resource consumption for deployment in production environments.

deepseek-ai/DeepSeek-R1-Distill-Llama-8B (RAG)

DeepSeek-R1-Distill-Llama-8B is a distilled version of the DeepSeek-R1 model based on the LLaMA architecture, featuring 8 billion parameters. It offers a balance between performance and efficiency by leveraging knowledge distillation techniques to reduce computational costs while maintaining high-quality language processing capabilities. This model is ideal for applications that require powerful text generation and understanding with optimized resource usage.

deepseek-ai/DeepSeek-R1-Distill-Qwen-7B (RAG)

DeepSeek-R1-Distill-Qwen-7B is a distilled version of the DeepSeek-R1 model with 7 billion parameters, designed to provide high-quality language understanding while optimizing efficiency. Leveraging advanced knowledge distillation techniques, it retains the core capabilities of larger models with improved speed and lower resource consumption. This model is well-suited for tasks requiring robust natural language processing while maintaining cost-effective deployment.

deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B (RAG)

DeepSeek-R1-Distill-Qwen-1.5B is a distilled version of the DeepSeek-R1 model, designed to offer a balance between efficiency and performance. With 1.5 billion parameters, it leverages knowledge distillation techniques to retain the capabilities of larger models while optimizing for faster inference and reduced resource consumption. Ideal for applications requiring high-quality language understanding with lower computational overhead.

utter-project/EuroLLM-9B-Instruct (RAG)

EuroLLM-9B-Instruct is a 9-billion-parameter multilingual AI model developed to understand and generate text across all European Union languages and additional relevant languages. It has been trained on a vast dataset of 4 trillion tokens and further fine-tuned on EuroBlocks to improve its performance in instruction-following and machine translation tasks.

utter-project/EuroLLM-1.7B-Instruct (RAG)

EuroLLM-1.7B-Instruct is a 1.7-billion-parameter multilingual AI model designed to understand and generate text in all European Union languages and additional relevant languages. It has been trained on 4 trillion tokens from diverse sources and further instruction-tuned on EuroBlocks to enhance its capabilities in general instruction-following and machine translation.

m42-health/Llama3-Med42-8B (RAG)

Llama3-Med42-8B is an open-access clinical large language model fine-tuned by M42, based on LLaMA-3, with 8 billion parameters. It is designed to provide high-quality, reliable answers to medical queries, enhancing access to medical knowledge.
meta-llama/Llama-3.2-3B-Instruct (RAG)

The Llama 3.2 collection of multilingual large language models (LLMs) comprises pretrained and instruction-tuned generative models in 1B and 3B sizes (text in/text out). The Llama 3.2 instruction-tuned text-only models are optimized for multilingual dialogue use cases, including agentic retrieval and summarization tasks.
Wiro AI makes machine learning easily accessible to everyone in the cloud.
  • WIRO
  • About
  • Blog
  • Careers
  • Contact
  • Product
  • Explore
  • Pricing
  • Status
  • Documentation
  • Introduction
  • Start Your First Project
  • Example Projects

2026 © Wiro.ai | Terms of Service & Privacy Policy