When interacting with a chatbot, there are no indications of what to say or how to say it. Without affordances, it’s difficult to know what to do at first.
Affordances are important. For example, you may have never interacted with a particular door before, but you can usually figure out how it works: a flat panel makes it clear you should push, while a handle indicates that you should pull.
Because large language models rely on text input (and text is universal and open-ended), this presents a UX challenge. How do you indicate what the bot can and can’t do? How do you help users get the best answers without requiring them to understand how the model works?
Read Why Chatbots Are Not the Future.
See also:
- Designing for non-experts
- Taste is the refined sense of judgment and finding balance that produces a pleasing and integrated whole
Links to this note
- Many vector databases can find the top k most similar results to a query but are unable to sort or filter by other document metadata. This is a pretty severe limitation for building LLM applications, especially ones where time is a dimension (meetings, calendars, task lists, etc.). For example, retrieving the 10 most similar results to the phrase “team meeting notes” but not being able to retrieve the team meeting notes from the last month. A rough sketch of the kind of combined query this calls for appears after this list.
- Intent-Based Outcome Specification: A new paradigm for user interfaces is starting to take shape with the rise of AI-powered tools. Rather than a loop of sending a command, receiving the output, and continuing (like graphical user interfaces), an intent-based outcome specification tells the computer what the outcome should be: “open the door” instead of “check auth, unlock, open latch, extend door”.
- LLM Latency Is Output-Size Bound: As it stands today, LLM applications have noticeable latency, but much of it is output-size bound rather than input-size bound. That means the amount of text that goes into a prompt matters far less for latency than the amount of text the model generates. A back-of-envelope model of this is sketched after this list.
- LLM Applications Need Creativity: Making the most of practical applications of large language models requires creativity. It’s a blank canvas, much like the one early mobile application developers faced when a new set of APIs unlocked new possibilities.
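
To make the vector-database limitation above concrete, here is a minimal sketch of the combined query that excerpt asks for: top-k similarity search plus a metadata filter on time. It uses a plain in-memory list and numpy rather than any particular vector database, and the documents, dates, and embedding dimension are invented for illustration.

```python
from datetime import datetime, timedelta
from typing import Optional

import numpy as np

# Hypothetical in-memory "index": each document carries an embedding plus metadata.
documents = [
    {"text": "Team meeting notes: roadmap review", "date": datetime(2024, 5, 2), "embedding": np.random.rand(384)},
    {"text": "Team meeting notes: hiring sync", "date": datetime(2024, 1, 10), "embedding": np.random.rand(384)},
]


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def search(query_embedding: np.ndarray, k: int = 10, since: Optional[datetime] = None):
    """Top-k similarity search with an optional metadata filter on date."""
    candidates = documents
    if since is not None:
        # The metadata filter the note wishes for: restrict by time before ranking.
        candidates = [d for d in candidates if d["date"] >= since]
    ranked = sorted(
        candidates,
        key=lambda d: cosine_similarity(query_embedding, d["embedding"]),
        reverse=True,
    )
    return ranked[:k]


# e.g. the 10 results most similar to "team meeting notes", but only from the last month
query = np.random.rand(384)  # stand-in for an embedding of the query text
recent_meeting_notes = search(query, k=10, since=datetime.now() - timedelta(days=30))
```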
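
Similarly, the latency claim can be made concrete with a back-of-envelope model: generation is autoregressive, so each output token adds roughly constant time, while the prompt is handled in a comparatively fast prefill pass. The throughput numbers below are assumptions chosen to show the shape of the math, not measurements of any particular model.

```python
# Assumed throughputs, for illustration only.
PREFILL_TOKENS_PER_SEC = 5000  # prompt processing (parallel over input tokens)
DECODE_TOKENS_PER_SEC = 50     # generation (one output token at a time)


def estimated_latency_seconds(prompt_tokens: int, output_tokens: int) -> float:
    """Rough total latency: one prefill pass plus per-token decoding."""
    return prompt_tokens / PREFILL_TOKENS_PER_SEC + output_tokens / DECODE_TOKENS_PER_SEC


# Doubling the prompt barely moves the total; doubling the output nearly doubles it.
print(estimated_latency_seconds(1000, 500))   # ~10.2s
print(estimated_latency_seconds(2000, 500))   # ~10.4s
print(estimated_latency_seconds(1000, 1000))  # ~20.2s
```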