
How to Integrate Tencent Cloud Real-Time Speech Recognition with C# in Unity

Published: 2021-07-12 11:54:28  Source: 億速云  Views: 375  Author: chen  Category: Programming Languages

This article explains how to integrate Tencent Cloud real-time speech recognition (ASR) into a C# Unity project. The walkthrough is straightforward, so follow along step by step.

Import the utility classes from Tencent's C# SDK:

https://github.com/TencentCloud/tencentcloud-sdk-dotnet

Signing utility classes:

https://github.com/TencentCloud/tencentcloud-sdk-dotnet/blob/master/TencentCloud/Common/Sign.cs

https://github.com/TencentCloud/tencentcloud-sdk-dotnet/blob/master/TencentCloud/Common/Profile/ClientProfile.cs

https://github.com/TencentCloud/tencentcloud-sdk-dotnet/blob/master/TencentCloud/Common/Profile/HttpProfile.cs
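
These classes build the sorted query string and hold the request configuration. For the WebSocket handshake, the signature is an HMAC-SHA1 (keyed with the account's SecretKey) over the request string "asr.cloud.tencent.com/asr/v2/{appid}?{sorted params}", Base64-encoded and then URL-escaped. A minimal, self-contained sketch of that computation (the SamiTool.ToHmacSHA1 helper used in the Unity script below wraps the same logic; the class and method names here are illustrative):

using System;
using System.Security.Cryptography;
using System.Text;

public static class AsrSignSketch
{
    // Computes the Base64 HMAC-SHA1 signature of the request string using the account's SecretKey.
    // URL-escape the result before appending it as "&signature=..." to the wss URL.
    public static string HmacSha1Base64(string signStr, string secretKey)
    {
        using (var hmac = new HMACSHA1(Encoding.UTF8.GetBytes(secretKey)))
        {
            byte[] hash = hmac.ComputeHash(Encoding.UTF8.GetBytes(signStr));
            return Convert.ToBase64String(hash);
        }
    }
}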

The following Unity script integrates Tencent Cloud real-time speech recognition:

using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using UnityEngine;
using System.Net.WebSockets;
using System.Security.Cryptography;
using System.Text;
using System.Threading;
// using Newtonsoft.Json;
using UnityEngine.UI;
using System.Linq;
using UnityEngine.Events;
using TencentCloud.Common.Profile;
using TencentCloud.Common;
using sami.pegamob;

[Serializable]
public struct TencentAsrResponse
{
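    // Maps the JSON payload returned over the ASR WebSocket (parsed with JsonUtility below).
    // slice_type marks the segment state (0 = segment started, 1 = recognition in progress, 2 = segment finished);
    // "final" is 1 once the whole audio stream has been recognized.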
    public int code;
    public string message;
    public string voice_id;
    public string message_id;
    [SerializeField] public Result result;
    [Serializable]
    public struct Result
    {
        public int slice_type;
        public int index;
        public int start_time;
        public int end_time;
        public string voice_text_str;
        public int word_size;
        [Serializable]
        public struct Word
        {
            public string word;
            public int start_time;
            public int end_time;
            public int stable_flag;
        }
        [SerializeField] public Word[] word_list;

    }
    public int final;
}

public class TencentAsr : MonoBehaviour
{
    public string appid;
    public string secretid;
    public string secretkey;
    public string timestamp;
    public string expired;
    public string nonce;
    public string engine_model_type;
    public string voice_id;
    public string voice_format;
    public string signature;
    public bool pausing;
    /*
    1251200071	
    SecretId: AKIDDDm46Af1at9VNyNlFDvfDr0vbgWwh0kE
    SecretKey: OXOXEfp7QHsCjTwA76UcU5SZTD7qJcl1
    */
    public SortedDictionary<string, string> asrParams = new SortedDictionary<string, string>();

    public AudioClip RecordedClip;
    private ClientWebSocket ws;
    private CancellationToken ct;
    private Coroutine sendCoroutine; // handle to the audio-streaming coroutine so StopASR can cancel it
    // Maximum recording length in seconds (up to 3599)
    private int MAX_RECORD_LENGTH = 1000;

    /// <summary>
    /// Callback event raised with the recognized text
    /// </summary>
    public event Action<string> asrCallback;
    public Transform notifyTarget; // optional target that receives "OnWords" messages with each result

    private void OnApplicationQuit()
    {
        //StopASR();
    }

    //wss://asr.cloud.tencent.com/asr/v2/1259228442?engine_model_type=16k_zh&expired=1592380492&filter_dirty=1&filter_modal=1&filter_punc=1&needvad=1&nonce=1592294092123&secretid=AKIDoQq1zhZMN8dv0psmvud6OUKuGPO7pu0r&timestamp=1592294092&voice_format=1&voice_id=RnKu9FODFHK5FPpsrN&signature=HepdTRX6u155qIPKNKC%2B3U0j1N0%3D
    //wss://asr.cloud.tencent.com/asr/v2/1251200071?engine_model_type=16k_en&expired=1617808000&nonce=1592294092123&secretid=AKIDDDm46Af1at9VNyNlFDvfDr0vbgWwh0kE&timestamp=1617721600&voice_format=1&voice_id=RnKu9FODFHK5FPpsrN&signature=T1RkUFNuSEJtZXdHbE1XMEJzSFRuRFBFWEVvPQ%3d%3d

    private Uri GetUri()
    {
        // Hardcoded demo credentials; replace with your own AppId/SecretId/SecretKey from the Tencent Cloud console.
        appid = "1251200071";
        secretid = "AKIDDDm46Af1at9VNyNlFDvfDr0vbgWwh0kE";
        secretkey = "OXOXEfp7QHsCjTwA76UcU5SZTD7qJcl1";
        timestamp = SamiTool.GetTimeStamp();
        expired = (SamiTool.GetTimeStampInt() + 24 * 3600).ToString();
        nonce = "1592294092123"; // TODO: generate a random nonce
        engine_model_type = "16k_en";
        voice_id = "RnKu9FODFHK5FPpsrN"; // TODO: generate a random voice_id per session
        voice_format = "1";
        asrParams.Clear(); // avoid duplicate-key exceptions if GetUri is called more than once
        asrParams.Add("secretid", secretid);
        asrParams.Add("timestamp", timestamp);
        asrParams.Add("expired", expired);
        asrParams.Add("nonce", nonce);
        asrParams.Add("engine_model_type", engine_model_type);
        asrParams.Add("voice_id", voice_id);
        asrParams.Add("voice_format", voice_format);
        asrParams.Add("word_info", "1"); // also return word-level timing info
        string str = "asr.cloud.tencent.com/asr/v2/" + appid + "?" + SignHelper.BuildParamStr(asrParams);
        Debug.Log(str);
        signature = SamiTool.ToHmacSHA1(str, secretkey);
        // WWW.EscapeURL is obsolete in newer Unity versions; UnityWebRequest.EscapeURL is the direct replacement.
        string url = "wss://" + str + "&signature=" + WWW.EscapeURL(signature);
        Debug.Log(url);
        return new Uri(url);
    }

    public void ConnectAsr()
    {
        //Uri asrUri = GetUri();
        StartASR();
    }

    public bool IsWsConnected()
    {
        bool connected = false;
        if (ws == null)
        {
            connected = false;
        }
        else
        {
            Debug.Log(ws.State);
            connected = (ws.State == WebSocketState.Connecting) || (ws.State == WebSocketState.Open);
        }
        return connected;
    }

    public void StartASR()
    {
        if (ws != null && ws.State == WebSocketState.Connecting)
        {
            Debug.LogWarning("The previous recognition session is still connecting");
            return;
        }

        if (ws != null && ws.State == WebSocketState.Open)
        {
            Debug.LogWarning("Failed to start speech recognition: wait for the previous session to finish");
            return;
        }

        if (Microphone.devices.Length == 0)
        {
            Debug.LogError("No available microphone detected");
            return;
        }

        ConnectASR_Async();
        // Record from the default microphone at 16 kHz to match the 16k engine model.
        RecordedClip = Microphone.Start(null, false, MAX_RECORD_LENGTH, 16000);
    }

    public async void StopASR()
    {
        Debug.Log("VC StopASR");
        if (ws != null)
        {
            // Stop the coroutine that streams audio to the server.
            if (sendCoroutine != null)
            {
                StopCoroutine(sendCoroutine);
                sendCoroutine = null;
            }

            // Once the audio upload is finished, the client must send {"type": "end"} as the end-of-stream flag.
            // Per the real-time ASR protocol this flag is a text frame; binary frames are treated as audio data.
            try
            {
                await ws.SendAsync(new ArraySegment<byte>(Encoding.UTF8.GetBytes("{\"type\": \"end\"}")), WebSocketMessageType.Text, true, new CancellationToken());
                await ws.CloseAsync(WebSocketCloseStatus.NormalClosure, "Closing WebSocket connection", new CancellationToken());
                ws.Dispose();
            }
            catch (System.Exception)
            {
                throw;
            }

            Microphone.End(null);
            StartCoroutine(StopRecord());
        }
    }

    private IEnumerator StopRecord()
    {
        yield return new WaitUntil(() => ws.State != WebSocketState.Open);
        Debug.Log("Recognition finished; recording stopped");
    }

    async void ConnectASR_Async()
    {
        ws = new ClientWebSocket();
        ct = new CancellationToken();
        Uri url = GetUri();
        await ws.ConnectAsync(url, ct);
        sendCoroutine = StartCoroutine(SendAudioClip());
        while (ws.State == WebSocketState.Open)
        {
            var buffer = new byte[4096];
            // Receive a frame and decode only the bytes that were actually read.
            WebSocketReceiveResult received = await ws.ReceiveAsync(new ArraySegment<byte>(buffer), ct);
            string str = Encoding.UTF8.GetString(buffer, 0, received.Count);
            if (string.IsNullOrEmpty(str))
            {
                continue;
            }

            try
            {
                TencentAsrResponse jsonData = JsonUtility.FromJson<TencentAsrResponse>(str);
                if (jsonData.code == 0)
                {
                    if (jsonData.message_id != null)
                    {
                        AnalysisResult(jsonData);
                    }
                    else
                    {
                        // The first message without a message_id is the handshake acknowledgement.
                        Debug.Log("Handshake succeeded!");
                        Debug.Log(str);
                    }
                }
                else
                {
                    Debug.Log("Error: " + jsonData.message);
                }
            }
            catch (Exception ex)
            {
                Debug.LogError(ex.Message + str);
            }
        }

        Debug.LogWarning("Connection closed");
    }

    IEnumerator SendAudioClip()
    {
        // Wait until the microphone has actually started producing samples.
        yield return new WaitWhile(() => Microphone.GetPosition(null) <= 0);
        float t = 0;
        int position = Microphone.GetPosition(null);
        const float waitTime = 0.04f; // send a chunk of audio every 40 ms
        const int Maxlength = 1280;   // at most 1280 samples (2,560 bytes of 16-bit PCM) per chunk
        int lastPosition = 0;
        while (position < RecordedClip.samples && ws.State == WebSocketState.Open)
        {
            t += waitTime;
            if (t >= MAX_RECORD_LENGTH)
            {
                Debug.Log("Maximum recording length reached; stopping speech recognition");
                break;
            }

            yield return new WaitForSecondsRealtime(waitTime);
            if (Microphone.IsRecording(null))
            {
                position = Microphone.GetPosition(null);
            }

            if (position <= lastPosition)
            {
                // Skip when the sample position has not advanced; length would be 0 and
                // AudioClip.GetData(float[] data, int offsetSamples) would throw.
                continue;
            }

            if (!pausing)
            {
                int length = position - lastPosition > Maxlength ? Maxlength : position - lastPosition;
                byte[] data = GetAudioClip(lastPosition, length, RecordedClip);
                // Stream the PCM chunk as a binary frame (fire-and-forget inside the coroutine).
                ws.SendAsync(new ArraySegment<byte>(data), WebSocketMessageType.Binary, true,
                    new CancellationToken());
                lastPosition = lastPosition + length;
            }
        }
    }

    public virtual void AnalysisResult(TencentAsrResponse tencentAsrResponse)
    {
        if (!string.IsNullOrEmpty(tencentAsrResponse.result.voice_text_str))
        {
            // Raise the C# event and also notify the optional target object via SendMessage.
            asrCallback?.Invoke(tencentAsrResponse.result.voice_text_str);
            if (notifyTarget != null)
            {
                notifyTarget.SendMessage("OnWords", tencentAsrResponse);
            }
        }
    }

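    /// <summary>
    /// Converts a slice of the recorded clip ("length" samples starting at "start")
    /// into 16-bit little-endian PCM bytes for streaming to the ASR service (voice_format = 1 selects raw PCM).
    /// </summary>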
    public static byte[] GetAudioClip(int start, int length, AudioClip recordedClip)
    {
        float[] soundata = new float[length];
        recordedClip.GetData(soundata, start);
        int rescaleFactor = 32767;
        byte[] outData = new byte[soundata.Length * 2];
        for (int i = 0; i < soundata.Length; i++)
        {
            short temshort = (short)(soundata[i] * rescaleFactor);
            byte[] temdata = BitConverter.GetBytes(temshort);
            outData[i * 2] = temdata[0];
            outData[i * 2 + 1] = temdata[1];
        }

        return outData;
    }

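    /// <summary>
    /// Saves a slice of the recorded clip to a temporary WAV file via the third-party WavUtility helper
    /// and returns the generated file path (not required for the streaming flow above).
    /// </summary>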
    public static string saveTempWave(int start,int length, AudioClip recordedClip)
    {
        float[] soundata = new float[length];
        recordedClip.GetData(soundata, start);       
        AudioClip tempClip = AudioClip.Create("tempwav",length,recordedClip.channels,recordedClip.frequency,false);
        tempClip.SetData(soundata,0);
        string tempwave = "";
        WavUtility.FromAudioClip (tempClip, out tempwave, true);
        Debug.Log(tempwave);
        return tempwave;
    }
}
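
The recognizer pushes each partial and final result to notifyTarget through SendMessage("OnWords", response), and (with the event wired up in AnalysisResult) also raises asrCallback with the recognized text. A minimal receiver component might look like the sketch below; the class and field names are illustrative, not part of the original project:

using UnityEngine;
using UnityEngine.UI;

// Illustrative receiver: attach to the GameObject assigned as notifyTarget.
public class AsrResultReceiver : MonoBehaviour
{
    public Text resultText;        // UI Text that displays the recognized sentence
    public TencentAsr recognizer;  // reference to the TencentAsr component

    void Start()
    {
        recognizer.ConnectAsr();   // open the WebSocket and start recording
    }

    // Called by TencentAsr via SendMessage("OnWords", response)
    void OnWords(TencentAsrResponse response)
    {
        // Each message carries the current text of the segment; final == 1 marks the end of the stream.
        resultText.text = response.result.voice_text_str;
    }

    void OnDestroy()
    {
        recognizer.StopASR();      // send the end flag and close the connection
    }
}

Attach both components to scene objects, assign notifyTarget to the receiver's GameObject, and the displayed text will update as you speak.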

Thanks for reading. That covers integrating Tencent Cloud real-time speech recognition with C# in Unity; after working through it, verify the details in your own project. This is 億速云 — follow us for more articles on related topics.
