I think it is a problem. Maybe not for people like us, who understand the concept and its limitations, but “formal reasoning” is exactly how this technology is being pitched to the masses. “Take a picture of your homework and OpenAI will solve it”, “have it reply to your emails”, “have it write code for you”. All reasoning-heavy tasks.
On top of that, Google/Bing have it answering user questions directly, it’s commonly pitched as a “tutor” or an “assistant”, the OpenAI API is being shoved everywhere under the sun for all kinds of tasks, and nobody is attempting to clarify its weaknesses in their marketing.
As it becomes more and more common, more and more users will crop up who don’t understand that it’s fundamentally incapable of reliably doing these things.
Sounds like a CEO who doesn’t have a damn clue how code works. His description sounds like he thinks every line of code takes the same amount of time to execute, as if
x = 1;
takes as long as calling an encryption/decryption function. “Adding” code to bypass your encryption is obviously going to make things run way faster.
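To make the point concrete, here’s a rough Python sketch. PBKDF2 key derivation is just a stand-in for whatever encryption call his system was actually making; the point is only that a trivial assignment and a single crypto-grade operation differ by several orders of magnitude:

    import hashlib
    import os
    import timeit

    # Trivial assignment: effectively a single bytecode operation.
    assign_time = timeit.timeit("x = 1", number=1_000_000) / 1_000_000

    # A crypto-grade operation (PBKDF2 key derivation, standing in for an
    # encryption/decryption call): deliberately expensive by design.
    salt = os.urandom(16)
    crypto_time = timeit.timeit(
        lambda: hashlib.pbkdf2_hmac("sha256", b"password", salt, 100_000),
        number=10,
    ) / 10

    print(f"x = 1             : ~{assign_time * 1e9:.0f} ns per run")
    print(f"PBKDF2 (100k iter): ~{crypto_time * 1e3:.0f} ms per run")

On typical hardware the assignment lands in the tens of nanoseconds while the key derivation takes tens of milliseconds, roughly a million-fold gap, which is exactly why “adding” code that skips the encryption makes everything look faster.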