Translators are not AI usage. They are a different kind of system. They should have used another term to describe the use of LLM generative-type things instead of "AI". Artificial Intelligence has several established meanings, and now "AI" confuses people. What is meant by it?
Anything that uses a "prompt" to generate a thing is "AI" in this context.
Translating something is not a prompt to generate something. You can use an AI prompt to do so, but that's like cutting bread by baking a new slice of bread.
In apps like Photoshop, the "prompt" might not be a text prompt, but the concept still applies: you tell the system to generate something, and the black box doing so uses training data to simulate and predict some creative outcome. The outcome might be nonsensical, but it will look like the expected outcome from afar.
I would like to see additional “Some AI may be used somewhere” tag, for neutrally-minded developers and users.
That would not be for the neutral-minded. It would be for the "vegan"-minded.
There are claims that maybe a quarter of new code in major software is "AI"-generated. So the very browser people play a game in would be contaminated. And one can expect that game engines will also have a function here or a template there done with the help of AI. Like your example of googling calls: the examples one rehashes might have been AI-written.
Rehashing code is standard practice, so what AI does is not as uncommon for coders as it is for visual artists. You just do not rehash other people's art without some backlash; there is "tracing". Meanwhile, coders reuse whole libraries, since that is the point of libraries. Artists do not do that. They do not copy-paste a picture of a table, a picture of a glass bowl, and pictures of different fruit to "paint" something like this: https://en.wikipedia.org/wiki/Still_life