A voice can be "cloned" in just 3 seconds! Don't speak first when answering a call from an unknown number: AI voice cloning becomes fraudsters' new tool
2023-04-06 15:05
聯合報 (United Daily News) / compiled by 張佑生 / breaking news report
https://tech.udn.com/tech/story/123454/7080143


When you receive a call from an unknown number, be careful to let the other party speak first; otherwise they can easily record your voice and use it for extortion and fraud.

Visits in the past 31 days: 0
There are 0 fact-checking replies to this message. No response has been written yet; readers are advised to maintain healthy skepticism toward it.
Automated analysis from ChatGPT
The following is the AI's preliminary analysis of this message, which we hope will provide you with some ideas before it is fact-checked by a human.
Readers should pay attention to the following points:
1. Does the technology mentioned in the message actually exist, and is there a reliable source confirming it?
2. Is the scam technique described in the message real? Are there related reports or evidence supporting it?
3. Is the prevention method suggested in the message correct and feasible? Are there better ways to protect oneself?
Readers should keep these points in mind to avoid being misled by false or inaccurate information, and should take appropriate precautions.