Data, Privacy, and Security for Microsoft Copilot for Microsoft 365
LLM Plugin Directory
The following plugins are available for LLM. Here’s how to install them.
Local models
These plugins all help you run LLMs directly on your own computer:
- llm-llama-cpp uses llama.cpp to run models published in the GGUF format.
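Plugins for the LLM CLI are installed with its own `llm install <plugin-name>` command. For readers who want to see what running a local GGUF model looks like in code, here is a minimal Python sketch using llama-cpp-python, the Python bindings for llama.cpp that the plugin builds on. The model path and generation parameters are placeholders, not values taken from the plugin's documentation.

```python
# Minimal sketch: load a local GGUF model with llama-cpp-python and generate text.
# The model path is a hypothetical local file; substitute any GGUF model you have downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-2-7b-chat.Q4_K_M.gguf",  # placeholder GGUF file
    n_ctx=2048,  # context window size
)

output = llm(
    "Q: Name three uses of a large language model. A:",
    max_tokens=128,   # cap the length of the completion
    stop=["Q:"],      # stop before the model starts a new question
)
print(output["choices"][0]["text"])
```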
Building LLM-Powered Products, Part 2
Building LLM-Powered Products, Part 1
For the past 6 months, I’ve been working on LLM-powered applications using GPT and other AI-as-a-Service providers. Along the way, I produced a set of illustrations to help visualize and explain some general architectural concepts.
Below is the first batch; I’m hoping to add more later on.
Building LLM-Powered Applications: What You Need to Know
Building LLM-powered Applications
The past few weeks have been exciting for developers interested in deploying AI-powered applications. The field is evolving quickly, and it is now possible to build AI-powered applications without having to spend months or years learning the ins and outs of machine learning. This opens up a whole new world of possibilities, as developers can now experiment with AI in ways that were never before possible.
Leveraging Large Language Models in Software Applications
How can you leverage the capabilities of Large Language Models (LLMs) within your software applications?
You cannot simply create a thin application layer above an LLM API. Instead, you need to design and build a number of components to ‘tame’ the underlying models and to differentiate your product.
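As a rough illustration of that point, here is a minimal sketch (not taken from the article) of the kind of components that sit between the user and the model API: a prompt template, an input guardrail, and output validation. The `complete` callable is a stand-in for whatever LLM client the application uses, and the invoicing-assistant scenario is invented for the example.

```python
# Sketch of a "taming" layer: prompt template + input guardrail + output validation
# wrapped around an opaque LLM completion function.
import json
from typing import Callable

def build_prompt(user_question: str) -> str:
    # Prompt template: constrains the model to the product's task and output format.
    return (
        "You are a support assistant for an invoicing product.\n"
        "Answer the question below and reply as JSON with keys "
        '"answer" and "confidence" (0-1).\n\n'
        f"Question: {user_question}\n"
    )

def answer(user_question: str, complete: Callable[[str], str]) -> dict:
    # Input guardrail: reject inputs the product should not handle.
    if len(user_question) > 2000:
        raise ValueError("question too long")

    raw = complete(build_prompt(user_question))

    # Output guardrail: validate the model's reply before it reaches the user.
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        return {"answer": "Sorry, I could not produce a reliable answer.", "confidence": 0.0}
```

Each of these pieces (templates, guardrails, validation) is also where a product can differentiate itself, since the raw model call is the same for everyone.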
Learn How to Build and Deploy Tool-Using LLM Agents with AWS SageMaker JumpStart Foundation Models
Large language model (LLM) agents are programs that extend the capabilities of standalone LLMs with 1) access to external tools (APIs, functions, webhooks, plugins, and so on), and 2) the ability to plan and execute tasks in a self-directed fashion. Often, LLMs need to interact with other software, databases, or APIs to accomplish complex tasks. For example, an administrative chatbot that schedules meetings would require access to employees’ calendars and email. With access to tools, LLM agents can become more powerful—at the cost of additional complexity.
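The administrative-chatbot example can be sketched as a small tool-calling loop. This is a generic illustration of the agent pattern, not the SageMaker JumpStart implementation: `complete` stands in for any LLM call, and `get_calendar` is a stubbed, hypothetical tool.

```python
# Sketch of an agent loop: the model either requests a tool call or returns a final
# answer, and the program loops until it gets a final answer or runs out of steps.
import json
from typing import Callable

TOOLS = {
    # Hypothetical tool, stubbed for the example.
    "get_calendar": lambda person: f"{person} is free Tuesday 10:00-11:00",
}

def run_agent(task: str, complete: Callable[[str], str], max_steps: int = 5) -> str:
    history = f"Task: {task}\n"
    for _ in range(max_steps):
        prompt = (
            "You can call a tool by replying with JSON "
            '{"tool": <name>, "argument": <value>} or finish with '
            '{"final_answer": <text>}. Available tools: get_calendar(person).\n\n'
            + history
        )
        reply = json.loads(complete(prompt))

        if "final_answer" in reply:
            return reply["final_answer"]

        # Execute the requested tool and feed the observation back to the model.
        result = TOOLS[reply["tool"]](reply["argument"])
        history += f"Tool {reply['tool']} returned: {result}\n"

    return "No answer within the step budget."
```

The planning happens in the prompt and the loop; the added complexity the excerpt mentions comes from handling tool errors, malformed model output, and step budgets.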
LLM-Powered OLAP: The Tencent Application with Apache Doris
Six months ago, I wrote about why we replaced ClickHouse with Apache Doris as the OLAP engine for our data management system. Back then, we were struggling with the auto-generation of SQL statements. Since then, we have made enough progress that I think it can serve as a reference for you, so here I am again.
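To make "auto-generation of SQL statements" concrete, here is a hedged sketch of the general text-to-SQL pattern, not Tencent's actual pipeline: the table schema and the user's question go into a prompt, and a simple guardrail keeps anything other than a SELECT from reaching the OLAP engine. The schema and the `complete` function are invented for the example.

```python
# Sketch of text-to-SQL: schema + question in, a single guarded SQL statement out.
from typing import Callable

SCHEMA = """
CREATE TABLE app_events (
    event_date DATE,
    app_id     VARCHAR(64),
    user_id    VARCHAR(64),
    event_type VARCHAR(32)
);
"""

def generate_sql(question: str, complete: Callable[[str], str]) -> str:
    prompt = (
        "You translate analytics questions into SQL for the schema below.\n"
        "Return only the SQL statement, with no explanation.\n\n"
        f"{SCHEMA}\n"
        f"Question: {question}\nSQL:"
    )
    sql = complete(prompt).strip().rstrip(";")

    # Basic guardrail: only allow read-only queries to reach the OLAP engine.
    if not sql.lower().startswith("select"):
        raise ValueError(f"refusing to run non-SELECT statement: {sql!r}")
    return sql
```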
Building Scalable Large Language Model (LLM) Applications
🚀 People ask me what it takes to build a scalable Large Language Model (LLM) app in 2023. By scalable we mean hundreds of users and millisecond latency. Let me share some of what we have learned along the way.
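One common tactic behind latency numbers like these is response caching, so that repeated prompts never reach the model at all. The sketch below is a generic illustration under that assumption, not the author's architecture; the in-process dictionary would normally be replaced by a shared store such as Redis for hundreds of concurrent users.

```python
# Sketch of response caching: repeated prompts are served from a local cache in
# milliseconds, and only cache misses pay the full model latency.
import hashlib
from typing import Callable, Dict

_cache: Dict[str, str] = {}

def cached_complete(prompt: str, complete: Callable[[str], str]) -> str:
    key = hashlib.sha256(prompt.encode("utf-8")).hexdigest()
    if key in _cache:
        return _cache[key]          # cache hit: no model call
    response = complete(prompt)     # cache miss: one full-latency model call
    _cache[key] = response
    return response
```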