The LLM is taking the order — interpreting what people say into that simple text description. Not everyone talks the same or describes things the same way. That, I believe, is where the LLM is doing the bulk of the work. Then I'm sure there's some background stock management and health checking it manages as well.
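Roughly what that "interpret speech into a simple text description" step might look like — a toy, rule-based stand-in for the LLM, with a made-up `MENU` and `parse_order` purely for illustration. The point is the output shape: structured line items that downstream stock management could consume.

```python
import re

# Hypothetical menu; prices are made up for the example.
MENU = {"pizzaburger": 4.99, "water": 0.00, "fries": 1.99}

def parse_order(utterance: str) -> list[dict]:
    """Stand-in for the LLM step: free-form speech -> structured line items.
    A real system would hand the raw transcript to an LLM; this stub only
    handles a couple of phrasings to show the target output format."""
    word_to_num = {"a": 1, "an": 1, "one": 1, "two": 2}
    items = []
    for qty, name in re.findall(r"(\d+|an|a|one|two)\s+(\w+)", utterance.lower()):
        count = word_to_num.get(qty) or int(qty)
        # strip a trailing plural 's' so "two waters" matches "water"
        if name not in MENU and name.rstrip("s") in MENU:
            name = name.rstrip("s")
        if name in MENU:
            items.append({"item": name, "qty": count, "unit_price": MENU[name]})
    return items

print(parse_order("one pizzaburger, that's all"))
# -> [{'item': 'pizzaburger', 'qty': 1, 'unit_price': 4.99}]
```

The hard part the LLM actually solves is the long tail of phrasings ("gimme one of them pizza burgers, and that'll do it") — the stub above only works because the inputs are cherry-picked.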
Yeah, unlike a human who understands a customer saying "one pizzaburger, that's all", the app doesn't grasp that the order is complete. It just keeps asking obviously unwanted, cringey questions like "buy two, you'll save a few cents on the second one?" or "what will you drink with that?" or "is that a big menu?"…
What's wrong with an input machine with buttons or a touch screen?
Takes too long to hold down the button for 18,000 waters.
Not as hype-able to the C-suites and CEOs.
OT4G
(Order Time For Grandma)
We have apps for that, and they're typically a PITA. They certainly take longer than just talking through your order.
Not futuristic enough or something.
They can't answer questions or be changed simply via a software update.