I have to say that I am having a very hard time with this, because I don't quite understand what exactly is being sold and how it is going to be used. Are Anthropic and OpenAI selling the DoD a model (parameter weights and the like) or a chatbot program (the way ChatGPT and Gemini are sold to most of their consumers)? The scenario you mentioned is a drone querying Claude about whether a picture of a supposed adversary is actually an adversary -- but why couldn't the drone simply ask Claude for a description of the picture and then use a different program to make the decision about whether to fire? And if that happens, then yes, Claude did not make the decision to fire, but it is still being used in a loop without a human; it is just not making the final decision?
What is being sold (by OpenAI) is essentially API access.
Indeed, there are very complicated questions about how responsibility diffuses between systems that integrate components from multiple vendors. I don't think that makes it any safer though!
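To make the scenario above concrete, here is a minimal sketch of that architecture, with every name hypothetical and the model call stubbed out (no real vendor API is invoked): the model is used only to describe an image, and a separate, entirely mechanical program turns that description into the final decision. No human is in the loop, yet the model never "decides."

```python
# Hypothetical sketch of the "model describes, separate program decides" loop.
# `describe_image` stands in for an API call to a vision-language model;
# it is stubbed with canned text so the example is self-contained.

def describe_image(image_id: str) -> str:
    """Stub for a vendor API call returning a text description of an image."""
    canned = {
        "img_001": "a person carrying a rifle near a checkpoint",
        "img_002": "a child walking a dog in a park",
    }
    return canned.get(image_id, "no description available")

def final_decision(description: str) -> bool:
    """A separate, non-model program makes the final call.
    The model above never sees this logic, so it 'did not decide' --
    yet no human is anywhere in the loop."""
    return "rifle" in description.lower()

if __name__ == "__main__":
    for image_id in ["img_001", "img_002"]:
        desc = describe_image(image_id)
        print(image_id, final_decision(desc))
```

The point of the sketch is that responsibility is smeared across two components from (potentially) different vendors: the model vendor can truthfully say its model only produced a description, while the decision logic's author can say they only applied a simple rule to someone else's output.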
This is deeply insane