...
The biggest problem, as I see it, is the associated dumbing down of the audience that you refer to. A lot of younger translators are now forced to use machine translations (for cost reasons) but they just don't see the mistakes the machine makes. So over time a wrong translation becomes standard. ...
...
That is the situation that all businesses face right now, and it's a bit of a chicken and egg. It's very hard to find financial, tax, and accounting staff who are interested in "knowing" the rules and reasons. They want the answers. Quickly. As easily as possible. Maybe the biggest change isn't work ethic so much as a total loss of inquisitive nature. Why or how doesn't matter...just what.
For now, more senior staff can fill in the holes and avoid the breakage, but that's going to end soon. When nobody understands why things were done, how will they ever adapt to the constantly changing universe?
It may all end up OK, because when they can't explain things (like the US tax code), maybe they just start over and by default remove all of the subtle problems that have built up over time, but it's a huge risk.
As negative as I and others can be about AI and its use, it is an amazingly exciting time to be working. I've lived through the migration from analog to digital in finance, and wish I could stick around long enough to fix some of the legacy issues. Change can't come fast enough, as I probably only have 10 or 15 years left to play.
I was just having a discussion with my business partner about this very topic, albeit in a slightly different context. Content has been the proxy for value in a lot of professions for a long time. The best content won. Picture quality. Sound quality. Research details. Quality (as you say) was king, and an acceptance of lower quality was done as a concession for other realities, primarily cost.
Now, it's more about the experience. The ease of consumption, the integration (of everything), the universal availability of what I want, when I want it, wherever I want it. The vast majority of users will surrender accuracy for the experience. Thinking less is better than having to work harder, especially when so many don't know the difference between the right and wrong answers.
As for language, I'd suggest that it's one of the basic knowledge categories that people look for answers to. Chair. It's an object with pretty much a universal definition. Multiple legs, often with a back, used for sitting. I typed "what is the word for chair in the 10 most popular languages on the planet" into Bard just now...and it returned this..
Impressive, but basic. Sentence structure is more complex, but still relatively simple, as the rules are fixed. Would you give directions to your hotel using a free app to translate...sure. Would you buy a house or take a job after translating the contracts using a free tool? Probably not. It's a matter of risk/reward, and that kind of complex decisioning will take a very long time to program.
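A rough way to make that risk/reward point concrete is as an expected-cost comparison: the chance of a serious error times what it would cost you, plus whatever you pay up front. The numbers in the sketch below are entirely made up for illustration (error rates, the cost of a bad contract clause, a translator's fee); only the shape of the comparison matters.

```python
# Toy expected-cost comparison: free translation tool vs. paid human translator.
# Every number here is a hypothetical placeholder, not a real-world estimate.

def expected_cost(p_serious_error, cost_of_error, upfront_cost):
    """Up-front cost plus the probability-weighted cost of a serious mistake."""
    return upfront_cost + p_serious_error * cost_of_error

# Asking directions to a hotel: mistakes are common but nearly costless.
directions_free = expected_cost(p_serious_error=0.10, cost_of_error=5, upfront_cost=0)

# Translating a house-purchase contract: mistakes are rarer but can be ruinous.
contract_free = expected_cost(p_serious_error=0.02, cost_of_error=50_000, upfront_cost=0)
contract_human = expected_cost(p_serious_error=0.001, cost_of_error=50_000, upfront_cost=800)

print(f"Directions via free app:  ~{directions_free:.2f}")
print(f"Contract via free tool:   ~{contract_free:.2f}")
print(f"Contract via human:       ~{contract_human:.2f}")
# With these made-up figures the free app wins easily for directions,
# while the paid human wins for the contract once the stakes dominate.
```

The point isn't the particular numbers; it's that once an error can cost you the house, even a small residual error rate swamps the fee you'd pay a professional.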
The human brain is an amazingly complex system that we still don't completely understand. We can automate some of the basic tasks and heavy lifting, but decision-making for critical issues is a long, long way off. At least in my (quite possibly very simple) mind. Quality has lost ground to speed recently, but the pendulum will swing and people will want assurance of quality for the things they value enough to pay for.
You would be surprised how good some of the machine translations are getting. I also long thought context-based translation would be one step too far for a machine, but I was wrong. They are surprisingly good at it. In fact, they are frequently better than human translators. And getting better.
Talking of contracts, I recently did a purchase contract for an American selling his apartment here, and we decided to run it through DeepL and have me tweak it a bit to make sure there were no gremlins in there. Admittedly, I changed a lot, but it wasn't too far off being acceptable for his purposes.
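For anyone curious, that machine-translate-then-post-edit workflow can be scripted; here is a minimal sketch assuming DeepL's official Python client (pip install deepl). The auth key is a placeholder, and the little watchlist of terms to hand-check is purely my own illustration, not a feature of the API.

```python
# Minimal machine-translate-then-post-edit sketch using the official DeepL
# Python client. DEEPL_AUTH_KEY and WATCHLIST are illustrative placeholders.
import deepl

DEEPL_AUTH_KEY = "your-auth-key-here"  # placeholder
translator = deepl.Translator(DEEPL_AUTH_KEY)

# Terms a human reviewer might want to double-check in a property contract.
WATCHLIST = ["cooperation", "liability", "notary", "deposit"]

def draft_translation(german_text: str) -> str:
    """Machine-draft a German clause into English for human review."""
    result = translator.translate_text(
        german_text, source_lang="DE", target_lang="EN-US"
    )
    return result.text

def flag_for_review(english_text: str) -> list[str]:
    """Return watchlist terms found in the draft, so a human re-reads them."""
    return [term for term in WATCHLIST if term.lower() in english_text.lower()]

clause = "Der Käufer trägt die Kosten der notariellen Beurkundung."
draft = draft_translation(clause)
print(draft)
print("Check by hand:", flag_for_review(draft))
```

The machine does the heavy lifting; the human still reads every clause, which is exactly the tweaking described above.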
The biggest problem, as I see it, is the associated dumbing down of the audience that you refer to. A lot of younger translators are now forced to use machine translations (for cost reasons) but they just don't see the mistakes the machine makes. So over time a wrong translation becomes standard. Likewise a lot of non-native speakers think, yeah, that's just what I wanted to say, but they don't get the cultural nuance or understand what is the right thing to say in a certain situation.
I recently had one CEO wanting to thank the Chairman of the Supervisory Board (who was stepping down and was basically his boss) for "his cooperation", which is fine in German, as it is neutral, but in English it has a distinct whiff of "thanks for doing everything I told you to do" and IMO inappropriate. He ran it through DeepL and came back at me to say I was wrong.
So, yeah, great, whatever, dude, I thought. Needless to say, that was one of the clients I lost. This guy is also kind of tone-deaf and the kind of non-native speaker who is convinced he has full mastery of the English language, so I'm not too unhappy to have lost his business. But the ironic thing is the Chairman of the Supervisory Board is also a non-native speaker and wouldn't have had any problem with the phrase, so it turns out the only person not on the same page is me. In these situations, the machine is going to win.
You are giving me hope, rgio!! But I have doubts about the bolded bit.
When DVDs came out, I had a friend who was a sound engineer and invested heavily in a new studio to produce high-end audio, believing the market would move to exploit the new sound quality the format afforded (mainly for classical music). The opposite happened. The market moved to compressed files and mp3, mainly because everyone could suddenly rip their own files, making them basically free. The same is now happening in my field. Machine translation can come up with some real clangers, but the translations are getting astonishingly better, very rapidly. There comes a point when customers' quality expectations begin to fall in favour of something that is maybe not quite so good, but (almost) free.
And if something as complex as language can be handled, with all its nuance, I think it is only a question of time before it starts handling most other fields. All you need is a massive dataset, an ability to recognize patterns, and a way to apply logical arguments and mathematical probability models, and hey presto, you've got something pretty damn close to the average brain.
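To make the "patterns plus probability over a massive dataset" idea concrete, here is a toy bigram model: it just counts which word follows which in a corpus and picks the most probable continuation. Real translation engines are incomparably larger and subtler, but the underlying counting-and-probability mechanic is the same; the three-sentence "corpus" below is obviously just a stand-in.

```python
# Toy bigram language model: count word pairs, then predict the most likely
# next word - a miniature stand-in for "patterns + probability over a dataset".
from collections import Counter, defaultdict

corpus = (
    "the chair has four legs . "
    "the chair has a back . "
    "the table has four legs ."
).split()

# Count how often each word follows each other word.
bigram_counts: dict[str, Counter] = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigram_counts[prev][nxt] += 1

def predict_next(word: str) -> tuple[str, float]:
    """Return the most frequent next word and its estimated probability."""
    counts = bigram_counts[word]
    best, freq = counts.most_common(1)[0]
    return best, freq / sum(counts.values())

for w in ["the", "chair", "has"]:
    nxt, p = predict_next(w)
    print(f"after '{w}': '{nxt}' (p = {p:.2f})")
```

Scale the corpus up by ten orders of magnitude and swap the counting for a neural network, and you get something much closer to the translation engines being discussed here.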
Speed comes in many forms. The ability to consume is now immense, but machines must be trained by humans for true AI. The training speed is going to take time. Google has the internet mapped, and the first few answers I got from Bard were fabricated crap.
There is also a reality that content and mediation/training are quickly forming teams, and will charge more for focused knowledge. ChatGPT might do an OK job translating, but I'm sure you have a favorite or two already. Your use of those systems will train the datasets you have access to, but it won't improve translation for everyone. The moats will get wider and the barriers to entry greater.
This year everyone will use as much as they can. Soon, errors will begin to matter, and people will pull back from the blind acceptance of AI that's going on now. We have entered the peak of inflated expectations per Gartner's Hype Cycle... the trough of disillusionment awaits.
I think it is going to go faster than you think. AI has reached a point where its ability to learn is outpacing its drawbacks. In my own field (translation) there have been serious inroads by AI tools, but we have learned to incorporate them. Last year was my busiest ever. This year is also busy (so far)... but prices have plummeted. I can't afford to keep my employee on anymore, which means a loss of quality and ultimately less competitiveness in a market dominated by increasingly powerful AI machines. I have lost one major client so far (well, two actually, but won three others), but still... it's only a matter of time. Time to think about becoming a professional athlete, I guess.
what's left? clergy, diplomats, plumbers.. might go for plumber.
I have a few friends who are plumbers - you can make a good living at it once you get established, but among the building trades, fewer people seem to want that one - especially when starting out. Yeah, I can't see AI being able to do much of what it involves - at least unless/until the methods and materials of construction change drastically.
1. therapists
2. social workers
....
19. plumbers
20. small business owners
It's going to be a long time before AI replaces almost all jobs. What will happen quickly is that it will separate occupations into "those using AI" and those not. A lawyer who effectively uses it will be much more productive than one who doesn't. Accountants, coders, and dozens of others.
#20 on the list is Small business owners. While it may not eliminate the need for people to run small companies, it very well may be a requirement to successfully market, manage, and grow a company. At some point, not using AI as a small business owner is going to put your company at a competitive disadvantage.
FWIW - I've been using both GPT-4 and Bard, and they both have some serious problems.
There are already quite a few people (entities?) selling prints of AI-generated "art" for substantial amounts of money. Midjourney is one of the popular apps (among very many) - just type in a prompt/description and it makes a "painting". Not necessarily "replacing" artists or designers, but not helping either, except perhaps in the cases of artists or designers using the apps as a visualization/layout tool.
ok, scratch artists.
1. therapists
2. social workers
3. early childhood educators
4. nurses
5. artists
6. writers
......
ok, I'm going for 6 and 7. or maybe an occupational therapist for ethical marine biologists. choices!
Elon Musk, who helped start OpenAI in 2015 before departing three years later, has accused ChatGPT of being "woke" and pledged to build his own version.
Gab, a social network with an avowedly Christian nationalist bent that has become a hub for white supremacists and extremists, has promised to release A.I. tools with "the ability to generate content freely without the constraints of liberal propaganda wrapped tightly around its code."
"Silicon Valley is investing billions to build these liberal guardrails to neuter the A.I. into forcing their worldview in the face of users and present it as 'reality' or 'fact,'" Andrew Torba, the founder of Gab, said in a written response to questions.
He equated artificial intelligence to a new information arms race, like the advent of social media, that conservatives needed to win. "We don't intend to allow our enemies to have the keys to the kingdom this time around," he said.