"Openn i a" can cloning noises but it shouldn't do
OpenAI's latest tool is so sensitive, and so fraught with risk, that the company has so far held off on releasing it. The tool, called Voice Engine, can generate "natural-sounding speech that closely resembles the original speaker" from just a 15-second sample of that person's voice. The technology itself is not new: startups such as ElevenLabs and HeyGen already offer voice-cloning tools that work from short audio samples. But OpenAI has shown it can build a better product than what is currently on offer.

Even so, it would have been better for OpenAI to stay out of this field altogether. The problem is not the technology but OpenAI's insistence on putting artificial intelligence within everyone's reach. The company says it will decide whether to release Voice Engine publicly after running small-scale tests and weighing the results of a "dialogue" about how society responds.

There is no doubt that voice cloning carries obvious risks. OpenAI acknowledges that those risks are heightened in a major election year, yet at the same time it says it wants to "understand the limits of the technology" and to show the world what is possible with artificial intelligence. It is worth remembering that OpenAI is no longer a non-profit organization but a company that must protect its lead in the AI race it itself set off. So it would not be surprising if OpenAI releases Voice Engine sometime this year.

The company made a similar fuss in February 2019, when it released only part of GPT-2, the language model that preceded ChatGPT, voicing concern that bad actors would misuse it. Yet barely nine months later it released the full model, saying it had "seen no strong evidence of misuse so far." In the meantime its incentives had shifted: OpenAI had become a for-profit company and taken a $1 billion investment from Microsoft.

Is OpenAI genuinely being cautious, or is it polishing its image? The company says its goal is "beneficial AI," so it was only natural that the post unveiling Voice Engine highlighted the good the tool could do, including giving a voice to patients and people with disabilities who cannot speak. Noble as those aims are, making life easier has long been used to give new technologies a human face. Text-to-speech software was first marketed as a way to help the blind, only to end up powering general-purpose applications such as Siri, Google Assistant and GPS navigation systems. Elon Musk promotes Neuralink as a way to help people with paralysis, but his long-term goal is to implant chips in the brains of billions of people. On the ground, meanwhile, artificial intelligence threatens to make the lives of people with disabilities harder, not easier.
AI tools have already been used to inadvertently screen out job applicants, including people with disabilities, and a 2023 ProPublica investigation found that the insurance giant Cigna used algorithms that allowed its doctors to reject large numbers of insurance claims, a practice that disproportionately hit people with disabilities. Cigna described the ProPublica report as "biased and incomplete."

The safeguards OpenAI proposes for this technology also inspire little confidence. It suggests, for instance, a "no-go list" to block the generation of voices that closely resemble those of prominent figures. But the dangers of voice cloning will hurt ordinary people more than celebrities. Most of the pornographic deepfakes that spread last year, as generative AI advanced, targeted not just celebrities but ordinary women. Nor does obtaining the consent of the original speaker always work the way OpenAI intends. Not long ago HeyGen, which OpenAI says it worked with in developing Voice Engine, was used to clone the voice of a Ukrainian YouTube influencer without her knowledge or consent. Olga Loiek spotted the HeyGen watermark on one of hundreds of videos that used her likeness and her voice on a Chinese social media app. HeyGen says on its website that it requires a person's consent before using their voice, but in Loiek's case that policy appears to have been ineffective.

This is a technology that may well not serve humanity. For all the beneficial examples OpenAI cites, cloning human voices opens the door to harms that will be hard to contain, and it is an unnecessary risk. It not only puts a new tool within reach of fraudsters, bullies and purveyors of disinformation, it is also likely to cause trouble in the entertainment industry, whose executives OpenAI has been courting with previews of its video generator, Sora.

OpenAI is driven by the race it sparked with the launch of ChatGPT, and it is now under pressure to hold its lead by shipping better versions of its competitors' tools and by drawing more people to the AI systems it builds. That is why it recently dropped the requirement to sign in before using ChatGPT. OpenAI insists it is acting on its mission to build artificial intelligence that serves humanity, but the harm voice cloning can do looks greater and broader than its benefits. Keeping its place in the commercial race may serve the company well; its claims of public benefit deserve less applause.