Microsoft 365 Copilot trips over angle brackets, frustrating coders

Chatbot seems to choke when fed tricky less-than character

Microsoft 365 Copilot appears to have developed an allergy to the less-than typographical symbol, which is preventing users from pasting HTML markup and programming code into the text area for Copilot prompts. Copilot is Microsoft's term for its generative AI service, and Microsoft 365 Copilot makes that service available within Redmond's productivity application suite. The AI assistant, however, appears to be having some trouble with text handling.

A Copilot user contacted The Register to alert us to the issue, noting the inability to use the less-than (<) character in the Copilot prompt box. Were we to hazard a guess about why this is happening, we'd suggest it may have something to do with a poorly implemented content sanitization routine. Web forms often disallow the submission of characters used for HTML markup to prevent cross-site scripting attacks.
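To illustrate the kind of over-eager filtering we're speculating about, here's a minimal TypeScript sketch of a hypothetical prompt sanitizer that strips anything resembling HTML markup. The function name, regex, and sample input are our own invention for illustration, not anything taken from Microsoft's code.

```typescript
// Hypothetical illustration only: a naive client-side "sanitizer" of the kind
// that could produce this behaviour. This is not Microsoft's actual code.

// Strips anything that looks like an HTML tag before the prompt is submitted.
function naiveSanitizePrompt(input: string): string {
  // Dropping every '<' and what follows it blocks <script> injection,
  // but it also destroys legitimate pastes such as C++ includes,
  // generics like Array<number>, or HTML a user wants help with.
  return input.replace(/<[^>]*>?/g, "");
}

const pasted = '#include <iostream>\nconst xs: Array<number> = [];';
console.log(naiveSanitizePrompt(pasted));
// Prints:
//   #include
//   const xs: Array = [];
// The code the user pasted has been silently mangled.
```

The usual remedy is to escape rather than strip: convert < to &lt; at the point where text is rendered as HTML, and leave the user's raw input alone.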

It may be that something similar was added to Copilot without recognizing how this might affect code-based input, or that the Copilot prompt input element is somehow inheriting web-based content filtering code from the underlying page. But really, you'd have to ask Microsoft. We did that, asking the tech giant to explain what's going on and whether a fix is forthcoming.

While our inquiry was acknowledged, we've not heard back. Perhaps the Copilot team is busy talking up AI software at Microsoft Ignite. ®