Ancient Divine Weapons, Primordial Treasures: Installing and Configuring the NeoVim 0.8.2 Editor on Windows 11 to Build a Python3 Development Environment (2023 Guide)
Without a doubt, we live in the golden age of editors. Vim is a god-tier editor second only to Vi, and NeoVim, which grew out of Vim, is the best editor of our time, bar none. Asynchronous job support, better memory management, faster rendering, and a richer set of editing commands are the great Thiago de Arruda's finest technical gift to developers.

Great Voice, Great Looks: Giving an AI Voice Model a Moving Picture with PaddleGAN (Python3.10)

With So-vits we can train all kinds of voice models and recreate any song we fancy, achieving song-on-demand freedom. Yet something always seems to be missing: the picture. We hear the voice but never see the singer. This time we will let AI Trump's singing and his imposing figure appear together, using PaddleGAN to build a "great voice, great looks" Trump.

PaddlePaddle is Baidu's open-source deep learning framework. It is remarkably comprehensive, covering some 40 models across the three domains of text, image, and video. Wav2lip, a submodule of PaddleGAN's visual-effects models, is a repackaged and optimized version of the open-source Wav2lip library. It synchronizes a subject's lip movements with the input vocal audio; in plain terms, it makes the lips of a static image move so the person appears to be singing. Beyond that, Wav2lip can also perform lip replacement directly on video, producing footage that matches a target audio track, which means we can use AI to build a custom talking-head presenter of our own.

Configuring CUDA and cudnn locally

Getting the PaddlePaddle framework running locally is not trivial, but with Baidu, a heavyweight of the domestic deep-learning scene, standing behind it, the documentation is rich; follow the steps and nothing much can go wrong.

First, set up a local Python3.10 development environment; see: "One Net to Catch Them All: Installing and Configuring a Python3.10 Development Environment on Different Architectures (Intel x86 / Apple M1 silicon) and Platforms (Win10/Win11/Mac/Ubuntu)".

Next, configure CUDA and cudnn locally. cudnn is a CUDA-based GPU acceleration library for deep learning; without it, deep-learning computation cannot run on the GPU. Think of cudnn as the tool and CUDA as the computing platform it plugs into; the versions of the two must match.

First open the NVIDIA control panel and check which CUDA version your card's driver supports. As the screenshot shows, my card is an RTX 4060 and the current driver supports CUDA up to 12.1; in other words, any CUDA version at or below 12.1 will work.

Then consult the official PaddlePaddle documentation for the framework builds available for Python3.10:

https://www.paddlepaddle.org.cn/documentation/docs/zh/install/Tables.html#ciwhls-release

According to the docs, the highest build supported for Python3.10 is win-cuda11.6-cudnn8.4-mkl-vs2017-avx, i.e. CUDA 11.6 with cudnn 8.4; anything newer is unsupported. So we need to install CUDA 11.6 and cudnn 8.4. The versions must match exactly, or the program will fail to start later on.

With the version numbers known, download the installers from NVIDIA's site.

CUDA 11.6 installer: https://developer.nvidia.com/cuda-toolkit-archive
cudnn 8.4 archive: https://developer.nvidia.com/rdp/cudnn-archive

Install CUDA 11.6 first. Then unzip the cudnn 8.4 archive and copy the extracted files into the CUDA 11.6 installation directory:

C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.6

Next add the bin directory to the system PATH:

C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.6\bin

Then open a terminal in the demo folder:

C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.6\extras\demo_suite

and run bandwidthTest.exe, which returns:

C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.6\extras\demo_suite>bandwidthTest.exe
[CUDA Bandwidth Test] - Starting...
Running on...

 Device 0: NVIDIA GeForce RTX 4060 Laptop GPU
 Quick Mode

 Host to Device Bandwidth, 1 Device(s)
 PINNED Memory Transfers
   Transfer Size (Bytes)        Bandwidth(MB/s)
   33554432                     12477.8

 Device to Host Bandwidth, 1 Device(s)
 PINNED Memory Transfers
   Transfer Size (Bytes)        Bandwidth(MB/s)
   33554432                     12337.3

 Device to Device Bandwidth, 1 Device(s)
 PINNED Memory Transfers
   Transfer Size (Bytes)        Bandwidth(MB/s)
   33554432                     179907.9

Result = PASS

NOTE: The CUDA Samples are not meant for performance measurements. Results may vary when GPU Boost is enabled.

This means the installation succeeded. You can then query the GPU with deviceQuery.exe:

C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.6\extras\demo_suite>deviceQuery.exe
deviceQuery.exe Starting...

 CUDA Device Query (Runtime API) version (CUDART static linking)

Detected 1 CUDA Capable device(s)

Device 0: "NVIDIA GeForce RTX 4060 Laptop GPU"
  CUDA Driver Version / Runtime Version          12.1 / 11.6
  CUDA Capability Major/Minor version number:    8.9
  Total amount of global memory:                 8188 MBytes (8585216000 bytes)
MapSMtoCores for SM 8.9 is undefined.  Default to use 128 Cores/SM
MapSMtoCores for SM 8.9 is undefined.  Default to use 128 Cores/SM
  (24) Multiprocessors, (128) CUDA Cores/MP:     3072 CUDA Cores
  GPU Max Clock rate:                            2370 MHz (2.37 GHz)
  Memory Clock rate:                             8001 Mhz
  Memory Bus Width:                              128-bit
  L2 Cache Size:                                 33554432 bytes
  Maximum Texture Dimension Size (x,y,z)         1D=(131072), 2D=(131072, 65536), 3D=(16384, 16384, 16384)
  Maximum Layered 1D Texture Size, (num) layers  1D=(32768), 2048 layers
  Maximum Layered 2D Texture Size, (num) layers  2D=(32768, 32768), 2048 layers
  Total amount of constant memory:               zu bytes
  Total amount of shared memory per block:       zu bytes
  Total number of registers available per block: 65536
  Warp size:                                     32
  Maximum number of threads per multiprocessor:  1536
  Maximum number of threads per block:           1024
  Max dimension size of a thread block (x,y,z): (1024, 1024, 64)
  Max dimension size of a grid size    (x,y,z): (2147483647, 65535, 65535)
  Maximum memory pitch:                          zu bytes
  Texture alignment:                             zu bytes
  Concurrent copy and kernel execution:          Yes with 1 copy engine(s)
  Run time limit on kernels:                     Yes
  Integrated GPU sharing Host Memory:            No
  Support host page-locked memory mapping:       Yes
  Alignment requirement for Surfaces:            Yes
  Device has ECC support:                        Disabled
  CUDA Device Driver Mode (TCC or WDDM):         WDDM (Windows Display Driver Model)
  Device supports Unified Addressing (UVA):      Yes
  Device supports Compute Preemption:            Yes
  Supports Cooperative Kernel Launch:            Yes
  Supports MultiDevice Co-op Kernel Launch:      No
  Device PCI Domain ID / Bus ID / location ID:   0 / 1 / 0
  Compute Mode:
     < Default (multiple host threads can use ::cudaSetDevice() with device simultaneously) >

deviceQuery, CUDA Driver = CUDART, CUDA Driver Version = 12.1, CUDA Runtime Version = 11.6, NumDevs = 1, Device0 = NVIDIA GeForce RTX 4060 Laptop GPU
Result = PASS

At this point CUDA and cudnn are configured.

Configuring the PaddlePaddle framework

With CUDA in place, install the PaddlePaddle framework:

python -m pip install paddlepaddle-gpu==2.4.2.post116 -f https://www.paddlepaddle.org.cn/whl/windows/mkl/avx/stable.html

This installs the GPU build of paddlepaddle, version 2.4.2.post116; 2.4 is the latest release, and the 116 suffix denotes the CUDA version. Again, be careful not to get the version wrong.
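The compatibility rule above, any CUDA toolkit no newer than what the driver reports is usable, while the Paddle wheel pins the exact toolkit/cudnn pair, can be written down as a small sanity check. This is a hypothetical helper for illustration only, not part of CUDA or PaddlePaddle:

```python
def parse_ver(v: str) -> tuple:
    """Turn a version string like '11.6' into (11, 6) for ordered comparison."""
    return tuple(int(x) for x in v.split("."))

def toolkit_usable(driver_max_cuda: str, toolkit: str) -> bool:
    """A CUDA toolkit works as long as it is not newer than the driver's max CUDA."""
    return parse_ver(toolkit) <= parse_ver(driver_max_cuda)

# The RTX 4060 driver reports CUDA 12.1; the Python3.10 Paddle wheel needs CUDA 11.6.
assert toolkit_usable("12.1", "11.6")      # OK to install
assert not toolkit_usable("12.1", "12.2")  # too new for this driver
```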
Then clone the PaddleGAN project:

git clone https://gitee.com/PaddlePaddle/PaddleGAN

From inside the project directory, build and install it locally:

pip install -v -e .

and install the remaining dependencies:

pip install -r requirements.txt

A few pitfalls deserve a note here. PaddleGAN still depends on an older numpy and does not support the latest 1.24 release, so if your numpy is 1.24, uninstall it first:

pip uninstall numpy

then install 1.21:

pip install numpy==1.21

Next, verify in a Python shell that PaddleGAN installed correctly:

import paddle
paddle.utils.run_check()

If you hit this error:

PreconditionNotMetError: The third-party dynamic library (cudnn64_7.dll) that Paddle depends on is not configured correctly. (error code is 126)
  Suggestions:
  1. Check if the third-party dynamic library (e.g. CUDA, CUDNN) is installed correctly and its version is matched with paddlepaddle you installed.
  2. Configure third-party dynamic library environment variables as follows:
  - Linux: set LD_LIBRARY_PATH by `export LD_LIBRARY_PATH=...`
  - Windows: set PATH by `set PATH=XXX;
  (at ..\paddle\phi\backends\dynload\dynamic_loader.cc:305)
  [operator < fill_constant > error]

you need to download the cudnn64_7.dll dynamic library and copy it into CUDA 11.6's bin directory; a download link is given at the end of this article.

Run the verification again and it returns:

Python 3.10.11 (tags/v3.10.11:7d4cc5a, Apr  5 2023, 00:38:17) [MSC v.1929 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import paddle
>>> paddle.utils.run_check()
Running verify PaddlePaddle program ...
W0517 20:15:34.881800 31592 gpu_resources.cc:61] Please NOTE: device: 0, GPU Compute Capability: 8.9, Driver API Version: 12.1, Runtime API Version: 11.6
W0517 20:15:34.889958 31592 gpu_resources.cc:91] device: 0, cuDNN Version: 8.4.
PaddlePaddle works well on 1 GPU.
PaddlePaddle works well on 1 GPUs.
PaddlePaddle is installed successfully! Let's start deep learning with PaddlePaddle now.
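Whether an environment actually needs the numpy downgrade can be checked mechanically before touching pip. A throwaway sketch; the needs_downgrade helper is made up for illustration:

```python
from importlib.metadata import version, PackageNotFoundError

def needs_downgrade(ver: str) -> bool:
    """PaddleGAN's pinned dependencies break on numpy >= 1.24."""
    major, minor = (int(x) for x in ver.split(".")[:2])
    return (major, minor) >= (1, 24)

try:
    installed = version("numpy")
    print(installed, "->", "downgrade to 1.21" if needs_downgrade(installed) else "ok")
except PackageNotFoundError:
    print("numpy not installed")
```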
That means everything is in place and the installation succeeded.

Now let's give the Trump song a moving picture. First generate a static image of the Donald with Stable-Diffusion. On Stable-Diffusion itself, see: "AI, Master Painter: Building the Stable-Diffusion-Webui AI Painting Library on Every Platform (Native/Docker) (Python3.10/Pytorch1.13.0)"; for brevity it is not repeated here.

Next, go to the project's tools directory:

\PaddleGAN\applications\tools>

and put the static Trump image and the song file into it. Then run the inference command locally:

python .\wav2lip.py --face .\Trump.jpg --audio test.wav --outfile pp_put.mp4 --face_enhancement

Here --face is the target image, --audio is the song the lips should match, and --outfile is the output video. The --face_enhancement flag adds face enhancement; without it, enhancement is off by default, but enabling it requires downloading a separate model file.

The key to Wav2Lip's breakthrough in accurate lip-audio synchronization is its lip-sync discriminator, which forces the generator to keep producing accurate, realistic lip movement. It also improves visual quality by feeding the discriminator several consecutive frames instead of a single one, and by using a visual quality loss (not merely a contrastive loss) to account for temporal correlation.

The result speaks for itself. Sometimes the pace of AI development really does make the world feel transformed overnight: what you hear may not be real, and neither may what you see. The finished video can be found on YouTube (and Bilibili) by searching for 刘悦的技术博客. All installers and dynamic libraries mentioned in this article are available at:

https://pan.baidu.com/s/1-6NA2uAOSRlT4O0FGEKUGA?pwd=oo0d
Extraction code: oo0d
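The wav2lip.py invocation shown above is easy to script once you have several image/audio pairs to render. A minimal sketch; build_wav2lip_cmd is a hypothetical helper name, and the file names are the ones used in this article:

```python
import subprocess

def build_wav2lip_cmd(face, audio, outfile, enhance=True):
    """Assemble the argv for one wav2lip render, mirroring the CLI flags above."""
    cmd = ["python", "wav2lip.py", "--face", face, "--audio", audio, "--outfile", outfile]
    if enhance:
        cmd.append("--face_enhancement")  # requires the extra enhancement model
    return cmd

jobs = [("Trump.jpg", "test.wav", "pp_put.mp4")]
for face, audio, outfile in jobs:
    cmd = build_wav2lip_cmd(face, audio, outfile)
    print(" ".join(cmd))
    # subprocess.run(cmd, check=True)  # uncomment to actually render
```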

Cloud Alchemy, Free Compute: Using the So-vits Library on a Cloud GPU (Colab) to Make AI Trump Sing "The Internationale"

AI has long since reached every corner of daily life: AI Stefanie Sun covers rise and fall everywhere you listen. But not everyone owns an NVIDIA card, and life without a GPU is rough. No matter; this humble author has a plan. This time we build a deep-learning environment on Google's free Colab cloud servers, create an AI Trump, and have him belt out "The Internationale".

Colab (in full, Colaboratory) is Google's free cloud-based service for writing and executing Python code on the browser side. It is very convenient, and best of all, Colab allocates free GPUs to its users. For anyone without an NVIDIA card this goes far beyond mere good value; it borders on charity work.

Configuring Colab

Colab is built on Google Drive: we can keep our deep-learning Python scripts, trained models, and training sets directly in Drive and execute them through Colab. First visit Google Drive: drive.google.com. Click New, choose "Connect more apps", and install Colab. Drive and Colab are now linked. Create a new notebook file my_sovits.ipynb and type some code:

hello colab

then press Ctrl+Enter to run it.

Note that Colab uses the Jupyter Notebook ipynb format for Python code. Jupyter Notebook opens as a web page in which you can write and run code directly, with each cell's result displayed right beneath it; you can also write documentation in the same page, which makes timely notes and explanations easy.

Next, set the accelerator type in the runtime settings, then check the GPU:

!/usr/local/cuda/bin/nvcc --version
!nvidia-smi

The program returns:

nvcc: NVIDIA (R) Cuda compiler driver
Built on Wed_Sep_21_10:33:58_PDT_2022
Cuda compilation tools, release 11.8, V11.8.89
Build cuda_11.8.r11.8/compiler.31833905_0
Tue May 16 04:49:23 2023
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 525.85.12    Driver Version: 525.85.12    CUDA Version: 12.0     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  Tesla T4            Off  | 00000000:00:04.0 Off |                    0 |
| N/A   65C    P8    13W /  70W |      0MiB / 15360MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|  No running processes found                                                 |
+-----------------------------------------------------------------------------+

The Tesla T4 is the recommended GPU type here; its performance stands out. Colab is now configured.

Configuring So-vits

Now let's set up the so-vits environment, starting with some base dependencies via pip:

!pip install pyworld==0.3.2
!pip install numpy==1.23.5

Note that Jupyter runs shell commands via the exclamation mark. Since this is not a local environment, Colab will sometimes warn you:

Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Collecting numpy==1.23.5
  Downloading numpy-1.23.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (17.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 17.1/17.1 MB 80.1 MB/s eta 0:00:00
Installing collected packages: numpy
  Attempting uninstall: numpy
    Found existing installation: numpy 1.22.4
    Uninstalling numpy-1.22.4:
      Successfully uninstalled numpy-1.22.4
Successfully installed numpy-1.23.5
WARNING: The following packages were previously imported in this runtime: [numpy]
You must restart the runtime in order to use newly installed versions.

The newly installed numpy cannot be imported until the runtime is restarted. After restarting, install again until the system reports the dependency already present:

Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Requirement already satisfied: numpy==1.23.5 in /usr/local/lib/python3.10/dist-packages (1.23.5)

Then clone the so-vits project and install its dependencies:

import os
import glob

!git clone https://github.com/effusiveperiscope/so-vits-svc -b eff-4.0
os.chdir('/content/so-vits-svc')

# install requirements one-at-a-time to ignore exceptions
!cat requirements.txt | xargs -n 1 pip install --extra-index-url https://download.pytorch.org/whl/cu117
!pip install praat-parselmouth
!pip install ipywidgets
!pip install huggingface_hub
!pip install pip==23.0.1 # fix pip version for fairseq install
!pip install fairseq==0.12.2
!jupyter nbextension enable --py widgetsnbextension

existing_files = glob.glob('/content/**/*.*', recursive=True)

!pip install --upgrade protobuf==3.9.2
!pip uninstall -y tensorflow
!pip install tensorflow==2.11.0

With the dependencies installed, define some upfront utility methods:

os.chdir('/content/so-vits-svc')  # force working-directory to so-vits-svc - this line is just for safety and is probably not required

import tarfile
import os
from zipfile import ZipFile

# taken from https://github.com/CookiePPP/cookietts/blob/master/CookieTTS/utils/dataset/extract_unknown.py
def extract(path):
    if path.endswith(".zip"):
        with ZipFile(path, 'r') as zipObj:
            zipObj.extractall(os.path.split(path)[0])
    elif path.endswith(".tar.bz2"):
        tar = tarfile.open(path, "r:bz2")
        tar.extractall(os.path.split(path)[0])
        tar.close()
    elif path.endswith(".tar.gz"):
        tar = tarfile.open(path, "r:gz")
        tar.extractall(os.path.split(path)[0])
        tar.close()
    elif path.endswith(".tar"):
        tar = tarfile.open(path, "r:")
        tar.extractall(os.path.split(path)[0])
        tar.close()
    elif path.endswith(".7z"):
        import py7zr
        archive = py7zr.SevenZipFile(path, mode='r')
        archive.extractall(path=os.path.split(path)[0])
        archive.close()
    else:
        raise NotImplementedError(f"{path} extension not implemented.")

# taken from https://github.com/CookiePPP/cookietts/tree/master/CookieTTS/_0_download/scripts
# megatools download urls
win64_url = "https://megatools.megous.com/builds/builds/megatools-1.11.1.20230212-win64.zip"
win32_url = "https://megatools.megous.com/builds/builds/megatools-1.11.1.20230212-win32.zip"
linux_url = "https://megatools.megous.com/builds/builds/megatools-1.11.1.20230212-linux-x86_64.tar.gz"

# download megatools
from sys import platform
import os
import urllib.request
import subprocess
from time import sleep

if platform == "linux" or platform == "linux2":
    dl_url = linux_url
elif platform == "darwin":
    raise NotImplementedError('MacOS not supported.')
elif platform == "win32":
    dl_url = win64_url
else:
    raise NotImplementedError('Unknown Operating System.')

dlname = dl_url.split("/")[-1]
if dlname.endswith(".zip"):
    binary_folder = dlname[:-4]  # remove .zip
elif dlname.endswith(".tar.gz"):
    binary_folder = dlname[:-7]  # remove .tar.gz
else:
    raise NameError('downloaded megatools has unknown archive file extension!')

if not os.path.exists(binary_folder):
    print('"megatools" not found. Downloading...')
    if not os.path.exists(dlname):
        urllib.request.urlretrieve(dl_url, dlname)
    assert os.path.exists(dlname), 'failed to download.'
    extract(dlname)
    sleep(0.10)
    os.unlink(dlname)
    print("Done!")

binary_folder = os.path.abspath(binary_folder)

def megadown(download_link, filename='.', verbose=False):
    """Use megatools binary executable to download files and folders from MEGA.nz ."""
    filename = ' --path "'+os.path.abspath(filename)+'"' if filename else ""
    wd_old = os.getcwd()
    os.chdir(binary_folder)
    try:
        if platform == "linux" or platform == "linux2":
            subprocess.call(f'./megatools dl{filename}{" --debug http" if verbose else ""} {download_link}', shell=True)
        elif platform == "win32":
            subprocess.call(f'megatools.exe dl{filename}{" --debug http" if verbose else ""} {download_link}', shell=True)
    except:
        os.chdir(wd_old)  # don't let user stop download without going back to correct directory first
        raise
    os.chdir(wd_old)
    return filename

import urllib.request
from tqdm import tqdm
import gdown
from os.path import exists

def request_url_with_progress_bar(url, filename):
    class DownloadProgressBar(tqdm):
        def update_to(self, b=1, bsize=1, tsize=None):
            if tsize is not None:
                self.total = tsize
            self.update(b * bsize - self.n)
    def download_url(url, filename):
        with DownloadProgressBar(unit='B', unit_scale=True, miniters=1, desc=url.split('/')[-1]) as t:
            filename, headers = urllib.request.urlretrieve(url, filename=filename, reporthook=t.update_to)
            print("Downloaded to "+filename)
    download_url(url, filename)

def download(urls, dataset='', filenames=None, force_dl=False, username='', password='', auth_needed=False):
    assert filenames is None or len(urls) == len(filenames), f"number of urls does not match filenames. Expected {len(filenames)} urls, containing the files listed below.\n{filenames}"
    assert not auth_needed or (len(username) and len(password)), f"username and password needed for {dataset} Dataset"
    if filenames is None:
        filenames = [None,]*len(urls)
    for i, (url, filename) in enumerate(zip(urls, filenames)):
        print(f"Downloading File from {url}")
        #if filename is None:
        #    filename = url.split("/")[-1]
        if filename and (not force_dl) and exists(filename):
            print(f"{filename} Already Exists, Skipping.")
            continue
        if 'drive.google.com' in url:
            assert 'https://drive.google.com/uc?id=' in url, 'Google Drive links should follow the format "https://drive.google.com/uc?id=1eQAnaoDBGQZldPVk-nzgYzRbcPSmnpv6".\nWhere id=XXXXXXXXXXXXXXXXX is the Google Drive Share ID.'
            gdown.download(url, filename, quiet=False)
        elif 'mega.nz' in url:
            megadown(url, filename)
        else:
            #urllib.request.urlretrieve(url, filename=filename) # no progress bar
            request_url_with_progress_bar(url, filename) # with progress bar

import huggingface_hub
import os
import shutil

class HFModels:
    def __init__(self, repo = "therealvul/so-vits-svc-4.0", model_dir = "hf_vul_models"):
        self.model_repo = huggingface_hub.Repository(local_dir=model_dir, clone_from=repo, skip_lfs_files=True)
        self.repo = repo
        self.model_dir = model_dir
        self.model_folders = os.listdir(model_dir)
        self.model_folders.remove('.git')
        self.model_folders.remove('.gitattributes')

    def list_models(self):
        return self.model_folders

    # Downloads model;
    # copies config to target_dir and moves model to target_dir
    def download_model(self, model_name, target_dir):
        if not model_name in self.model_folders:
            raise Exception(model_name + " not found")
        model_dir = self.model_dir
        charpath = os.path.join(model_dir, model_name)
        gen_pt = next(x for x in os.listdir(charpath) if x.startswith("G_"))
        cfg = next(x for x in os.listdir(charpath) if x.endswith("json"))
        try:
            clust = next(x for x in os.listdir(charpath) if x.endswith("pt"))
        except StopIteration as e:
            print("Note - no cluster model for "+model_name)
            clust = None
        if not os.path.exists(target_dir):
            os.makedirs(target_dir, exist_ok=True)
        gen_dir = huggingface_hub.hf_hub_download(repo_id = self.repo, filename = model_name + "/" + gen_pt)  # this is a symlink
        if clust is not None:
            clust_dir = huggingface_hub.hf_hub_download(repo_id = self.repo, filename = model_name + "/" + clust)  # this is a symlink
            shutil.move(os.path.realpath(clust_dir), os.path.join(target_dir, clust))
            clust_out = os.path.join(target_dir, clust)
        else:
            clust_out = None
        shutil.copy(os.path.join(charpath, cfg), os.path.join(target_dir, cfg))
        shutil.move(os.path.realpath(gen_dir), os.path.join(target_dir, gen_pt))
        return {"config_path": os.path.join(target_dir, cfg),
                "generator_path": os.path.join(target_dir, gen_pt),
                "cluster_path": clust_out}

# Example usage
# vul_models = HFModels()
# print(vul_models.list_models())
# print("Applejack (singing)" in vul_models.list_models())
# vul_models.download_model("Applejack (singing)","models/Applejack (singing)")
print("Finished!")

These methods help us download, extract, and load models.

Downloading the voice model and running inference online

Next, download the Trump voice model and its configuration file from:

https://huggingface.co/Nardicality/so-vits-svc-4.0-models/tree/main/Trump18.5k

Put the model file into the project's models folder and the configuration file into the config folder. Then upload the song you want to convert into a directory parallel to the project, and run:

import os
import glob
import json
import copy
import logging
import io
from ipywidgets import widgets
from pathlib import Path
from IPython.display import Audio, display

os.chdir('/content/so-vits-svc')

import torch
from inference import infer_tool
from inference import slicer
from inference.infer_tool import Svc
import soundfile
import numpy as np

MODELS_DIR = "models"

def get_speakers():
    speakers = []
    for _, dirs, _ in os.walk(MODELS_DIR):
        for folder in dirs:
            cur_speaker = {}
            # Look for G_****.pth
            g = glob.glob(os.path.join(MODELS_DIR, folder, 'G_*.pth'))
            if not len(g):
                print("Skipping "+folder+", no G_*.pth")
                continue
            cur_speaker["model_path"] = g[0]
            cur_speaker["model_folder"] = folder
            # Look for *.pt (clustering model)
            clst = glob.glob(os.path.join(MODELS_DIR, folder, '*.pt'))
            if not len(clst):
                print("Note: No clustering model found for "+folder)
                cur_speaker["cluster_path"] = ""
            else:
                cur_speaker["cluster_path"] = clst[0]
            # Look for config.json
            cfg = glob.glob(os.path.join(MODELS_DIR, folder, '*.json'))
            if not len(cfg):
                print("Skipping "+folder+", no config json")
                continue
            cur_speaker["cfg_path"] = cfg[0]
            try:
                with open(cur_speaker["cfg_path"]) as f:
                    cfg_json = json.loads(f.read())
            except Exception as e:
                print("Malformed config json in "+folder)
                continue
            for name, i in cfg_json["spk"].items():
                cur_speaker["name"] = name
                cur_speaker["id"] = i
                if not name.startswith('.'):
                    speakers.append(copy.copy(cur_speaker))
    return sorted(speakers, key=lambda x: x["name"].lower())

logging.getLogger('numba').setLevel(logging.WARNING)
chunks_dict = infer_tool.read_temp("inference/chunks_temp.json")
existing_files = []
slice_db = -40
wav_format = 'wav'

class InferenceGui():
    def __init__(self):
        self.speakers = get_speakers()
        self.speaker_list = [x["name"] for x in self.speakers]
        self.speaker_box = widgets.Dropdown(
            options = self.speaker_list
        )
        display(self.speaker_box)

        def convert_cb(btn):
            self.convert()
        def clean_cb(btn):
            self.clean()

        self.convert_btn = widgets.Button(description="Convert")
        self.convert_btn.on_click(convert_cb)
        self.clean_btn = widgets.Button(description="Delete all audio files")
        self.clean_btn.on_click(clean_cb)

        self.trans_tx = widgets.IntText(value=0, description='Transpose')
        self.cluster_ratio_tx = widgets.FloatText(value=0.0, description='Clustering Ratio')
        self.noise_scale_tx = widgets.FloatText(value=0.4, description='Noise Scale')
        self.auto_pitch_ck = widgets.Checkbox(value=False, description='Auto pitch f0 (do not use for singing)')

        display(self.trans_tx)
        display(self.cluster_ratio_tx)
        display(self.noise_scale_tx)
        display(self.auto_pitch_ck)
        display(self.convert_btn)
        display(self.clean_btn)

    def convert(self):
        trans = int(self.trans_tx.value)
        speaker = next(x for x in self.speakers if x["name"] == self.speaker_box.value)
        spkpth2 = os.path.join(os.getcwd(), speaker["model_path"])
        print(spkpth2)
        print(os.path.exists(spkpth2))

        svc_model = Svc(speaker["model_path"], speaker["cfg_path"],
                        cluster_model_path=speaker["cluster_path"])
        input_filepaths = [f for f in glob.glob('/content/**/*.*', recursive=True)
                           if f not in existing_files and
                           any(f.endswith(ex) for ex in ['.wav', '.flac', '.mp3', '.ogg', '.opus'])]
        for name in input_filepaths:
            print("Converting "+os.path.split(name)[-1])
            infer_tool.format_wav(name)
            wav_path = str(Path(name).with_suffix('.wav'))
            wav_name = Path(name).stem
            chunks = slicer.cut(wav_path, db_thresh=slice_db)
            audio_data, audio_sr = slicer.chunks2audio(wav_path, chunks)
            audio = []
            for (slice_tag, data) in audio_data:
                print(f'#=====segment start, {round(len(data)/audio_sr, 3)}s======')
                length = int(np.ceil(len(data) / audio_sr * svc_model.target_sample))
                if slice_tag:
                    print('jump empty segment')
                    _audio = np.zeros(length)
                else:
                    # Padding "fix" for noise
                    pad_len = int(audio_sr * 0.5)
                    data = np.concatenate([np.zeros([pad_len]), data, np.zeros([pad_len])])
                    raw_path = io.BytesIO()
                    soundfile.write(raw_path, data, audio_sr, format="wav")
                    raw_path.seek(0)
                    _cluster_ratio = 0.0
                    if speaker["cluster_path"] != "":
                        _cluster_ratio = float(self.cluster_ratio_tx.value)
                    out_audio, out_sr = svc_model.infer(
                        speaker["name"], trans, raw_path,
                        cluster_infer_ratio = _cluster_ratio,
                        auto_predict_f0 = bool(self.auto_pitch_ck.value),
                        noice_scale = float(self.noise_scale_tx.value))
                    _audio = out_audio.cpu().numpy()
                    pad_len = int(svc_model.target_sample * 0.5)
                    _audio = _audio[pad_len:-pad_len]
                audio.extend(list(infer_tool.pad_array(_audio, length)))
            res_path = os.path.join('/content/',
                                    f'{wav_name}_{trans}_key_{speaker["name"]}.{wav_format}')
            soundfile.write(res_path, audio, svc_model.target_sample, format=wav_format)
            display(Audio(res_path, autoplay=True))  # display audio file

    def clean(self):
        input_filepaths = [f for f in glob.glob('/content/**/*.*', recursive=True)
                           if f not in existing_files and
                           any(f.endswith(ex) for ex in ['.wav', '.flac', '.mp3', '.ogg', '.opus'])]
        for f in input_filepaths:
            os.remove(f)

inference_gui = InferenceGui()

The system then automatically looks for music files under the root directory, i.e. /content, including but not limited to wav, flac, and mp3, and runs inference with the downloaded model; before inference it automatically separates the backing track, denoises, and slices the file. When inference finishes, the converted song plays automatically.

Conclusion

When you first start using Colab, the allocated GPU has around 15G of video memory, which is more than adequate for most training and inference tasks. But if you routinely leave it crunching for long stretches, the hardware you are allocated degrades progressively; for long-running and relatively stable GPU resources you will need a paid Colab Pro subscription. Google Drive's free storage is also 15G, and if you download too many models and run out of Drive space, your code will error out as well, so clean up Google Drive regularly to keep your deep-learning jobs running smoothly.

A Folk Goddess Sings Pop: Training Your Own Voice Model with the AI so-vits Library (Ye Bei / Python3.10)

Pop diva Stefanie Sun's voice is of course superb, but the internet is by now awash in clones of it, and after a while the ear tires. A search of the web turned up no voice model for a folk singer yet; people being people, it is always what we cannot have that stirs us. So this time we build our own training set and craft our own voice model, letting a folk goddess sing pop songs, and the more spirited the better.

Building the training set

A training set is the collection of data used to train a neural-network model. It usually consists of a large number of inputs and their corresponding outputs; the network learns the relationship between the two and adjusts its parameters during training to minimize error. Put plainly, if we want a voice model of the folk singer Ye Bei, her songs serve as the input, that is, the training set. The training set supplies the material the model learns from, enabling it to learn the correct output from the input data; by iterating over it repeatedly, the network keeps optimizing itself and improving its predictions.

Indeed, so-vits is a neural-network architecture underneath, and training a voice model is at bottom a prediction problem. On neural-network architecture, see: "Dissecting the Fundamentals of AI Machine Learning: Artificial Neurons You Can Definitely Understand, Translating AI Jargon into Plain Language", not repeated here.

When selecting samples, prefer songs that carry the singer's distinctive vocal signature. Why is the whole internet covered in Stefanie Sun? Simply because her timbre is so recognizable that a model can more easily learn the correct output from the input data. Also, for training data, quality beats quantity: clean samples with strongly weighted features train better than low-quality ones. Covers sung by the artist, or songs using unconventional vocal technique, do carry some of the singer's timbre, but for model training they are actually counterproductive, which is something to watch out for.

Here I chose six songs from Ye Bei's early album 《幸福深处》.

Generally speaking, the more training data, the better the model performs, but in practice you have to weigh the trade-offs. Deep learning usually needs large amounts of data: computer-vision tasks, for instance, need large image datasets to train convolutional networks, whereas tasks such as speech recognition and natural language processing can reach high performance on comparatively less. You should ensure the training set contains sufficient, diverse samples covering all likely inputs, with enough positive and negative examples to secure classification performance. Beyond quantity, quality matters too: the set should be free of bias and noise, and preprocessing such as data cleaning and augmentation improves its quality and diversity. In short, how much data you need depends on the problem's complexity, the data's diversity, the model's complexity, and the training algorithm's efficiency; in practice you experiment and validate to find the right scale.

All told, given my machine's specs and the cost in training time, this training set is on the small side; adjust to your own circumstances, lavishly or frugally.

Cleaning the training data

With the training set assembled, the data needs cleaning: strip out the accompaniment, pauses, and mixed-in effects, leaving only the a-cappella vocal. For separating vocals from accompaniment, the spleeter library is recommended:

pip3 install spleeter --user

Then run the separation over the training songs:

spleeter separate -o d:/output/ -p spleeter:2stems d:/数据.mp3

Here -o is the output directory, -p selects the separation model, and last comes the input file. The first run is slow because spleeter downloads a pretrained model of about 1.73g. When it finishes, the separated tracks appear in the output directory:

 D:\歌曲制作\清唱 的目录

2023/05/11  15:38    <DIR>          .
2023/05/11  13:45    <DIR>          ..
2023/05/11  13:40        39,651,884 1_1_01. wxs.wav
2023/05/11  15:34        46,103,084 1_1_02. qad_(Vocals)_(Vocals).wav
2023/05/11  15:35        43,802,924 1_1_03. hs_(Vocals)_(Vocals).wav
2023/05/11  15:36        39,054,764 1_1_04. hope_(Vocals)_(Vocals).wav
2023/05/11  15:36        32,849,324 1_1_05. kamen_(Vocals)_(Vocals).wav
2023/05/11  15:37        50,741,804 1_1_06. ctrl_(Vocals)_(Vocals).wav
               6 个文件    252,203,784 字节
               2 个目录 449,446,780,928 可用字节

For more on spleeter, see: "Free Vocal and Backing-Track Separation in Practice with the AI Library Spleeter (Python3.10)", not repeated here.

The separated samples still need a second pass, because the split vocal carries some slight background sound and mixing residue. The noisereduce library is recommended:

pip3 install noisereduce soundfile

then denoise:

import noisereduce as nr
import soundfile as sf

# read the audio file
data, rate = sf.read("audio_file.wav")

# take a stretch of (near-silent) audio as the noise profile
noisy_part = data[10000:15000]

# apply noise reduction against that profile
reduced_noise = nr.reduce_noise(audio_clip=data, noise_clip=noisy_part, verbose=False)

# write the result to a new file
sf.write("audio_file_denoised.wav", reduced_noise, rate)

First read the song with soundfile, then take a noise sample and run the noise-reduction algorithm against it, and finally write the result to a new file. That largely completes the data cleaning.

Slicing the training data

During deep learning, the computer reads the training data into the GPU's memory; if the dataset is too large, memory overflows, the notorious "blown VRAM". Splitting the dataset into parts and loading one part at a time reduces memory use and also enables parallel processing, improving training efficiency. The github.com/openvpi/audio-slicer library works well here:

git clone https://github.com/openvpi/audio-slicer.git

then write:

import librosa  # Optional. Use any library you like to read audio files.
import soundfile  # Optional. Use any library you like to write audio files.
from slicer2 import Slicer

audio, sr = librosa.load('example.wav', sr=None, mono=False)  # Load an audio file with librosa.
slicer = Slicer(
    sr=sr,
    threshold=-40,
    min_length=5000,
    min_interval=300,
    hop_size=10,
    max_sil_kept=500
)
chunks = slicer.slice(audio)
for i, chunk in enumerate(chunks):
    if len(chunk.shape) > 1:
        chunk = chunk.T  # Swap axes if the audio is stereo.
    soundfile.write(f'clips/example_{i}.wav', chunk, sr)  # Save sliced audio files with soundfile.

This script cuts all the denoised a-cappella samples into small clips for easier training. If your machine is on the weaker side, consider tuning min_interval and max_sil_kept upward to cut the clips even finer, "chopped fine like mincemeat" as it were.

In the end, the six songs were cut into 140 small samples:

D:\歌曲制作\slicer 的目录 2023/05/11 15:45 <DIR> . 2023/05/11 13:45 <DIR> .. 2023/05/11 15:45 873,224 1_1_01. wxs_0.wav 2023/05/11 15:45 934,964 1_1_01. wxs_1.wav 2023/05/11 15:45 1,039,040 1_1_01. wxs_10.wav 2023/05/11 15:45 1,391,840 1_1_01. wxs_11.wav 2023/05/11 15:45 2,272,076 1_1_01. wxs_12.wav 2023/05/11 15:45 2,637,224 1_1_01. wxs_13.wav 2023/05/11 15:45 1,476,512 1_1_01.
wxs_14.wav 2023/05/11 15:45 1,044,332 1_1_01. wxs_15.wav 2023/05/11 15:45 1,809,908 1_1_01. wxs_16.wav 2023/05/11 15:45 887,336 1_1_01. wxs_17.wav 2023/05/11 15:45 952,604 1_1_01. wxs_18.wav 2023/05/11 15:45 989,648 1_1_01. wxs_19.wav 2023/05/11 15:45 957,896 1_1_01. wxs_2.wav 2023/05/11 15:45 231,128 1_1_01. wxs_20.wav 2023/05/11 15:45 1,337,156 1_1_01. wxs_3.wav 2023/05/11 15:45 1,308,932 1_1_01. wxs_4.wav 2023/05/11 15:45 1,035,512 1_1_01. wxs_5.wav 2023/05/11 15:45 2,388,500 1_1_01. wxs_6.wav 2023/05/11 15:45 2,952,980 1_1_01. wxs_7.wav 2023/05/11 15:45 929,672 1_1_01. wxs_8.wav 2023/05/11 15:45 878,516 1_1_01. wxs_9.wav 2023/05/11 15:45 963,188 1_1_02. qad_(Vocals)_(Vocals)_0.wav 2023/05/11 15:45 901,448 1_1_02. qad_(Vocals)_(Vocals)_1.wav 2023/05/11 15:45 1,411,244 1_1_02. qad_(Vocals)_(Vocals)_10.wav 2023/05/11 15:45 2,070,980 1_1_02. qad_(Vocals)_(Vocals)_11.wav 2023/05/11 15:45 2,898,296 1_1_02. qad_(Vocals)_(Vocals)_12.wav 2023/05/11 15:45 885,572 1_1_02. qad_(Vocals)_(Vocals)_13.wav 2023/05/11 15:45 841,472 1_1_02. qad_(Vocals)_(Vocals)_14.wav 2023/05/11 15:45 876,752 1_1_02. qad_(Vocals)_(Vocals)_15.wav 2023/05/11 15:45 1,091,960 1_1_02. qad_(Vocals)_(Vocals)_16.wav 2023/05/11 15:45 1,188,980 1_1_02. qad_(Vocals)_(Vocals)_17.wav 2023/05/11 15:45 1,446,524 1_1_02. qad_(Vocals)_(Vocals)_18.wav 2023/05/11 15:45 924,380 1_1_02. qad_(Vocals)_(Vocals)_19.wav 2023/05/11 15:45 255,824 1_1_02. qad_(Vocals)_(Vocals)_2.wav 2023/05/11 15:45 1,718,180 1_1_02. qad_(Vocals)_(Vocals)_20.wav 2023/05/11 15:45 2,070,980 1_1_02. qad_(Vocals)_(Vocals)_21.wav 2023/05/11 15:45 2,827,736 1_1_02. qad_(Vocals)_(Vocals)_22.wav 2023/05/11 15:45 862,640 1_1_02. qad_(Vocals)_(Vocals)_23.wav 2023/05/11 15:45 1,628,216 1_1_02. qad_(Vocals)_(Vocals)_24.wav 2023/05/11 15:45 1,626,452 1_1_02. qad_(Vocals)_(Vocals)_25.wav 2023/05/11 15:45 1,499,444 1_1_02. qad_(Vocals)_(Vocals)_26.wav 2023/05/11 15:45 1,303,640 1_1_02. qad_(Vocals)_(Vocals)_27.wav 2023/05/11 15:45 998,468 1_1_02. 
qad_(Vocals)_(Vocals)_28.wav
2023/05/11 15:45 1,368,908 1_1_02. qad_(Vocals)_(Vocals)_3.wav
2023/05/11 15:45 1,344,212 1_1_03. hs_(Vocals)_(Vocals)_0.wav
2023/05/11 15:45 1,023,164 1_1_04. hope_(Vocals)_(Vocals)_0.wav
2023/05/11 15:45 874,988 1_1_05. kamen_(Vocals)_(Vocals)_0.wav
2023/05/11 15:45 1,090,196 1_1_06. ctrl_(Vocals)_(Vocals)_0.wav
……(其余切分后的wav条目从略)
140 个文件 183,026,452 字节

至此,数据切分顺利完成。

开始训练

万事俱备,只差训练。首先配置so-vits-svc环境,请移步:AI天后,在线飙歌,人工智能AI孙燕姿模型应用实践,复刻《遥远的歌》,原唱晴子(Python3.10),囿于篇幅,这里不再赘述。随后将切分后的数据集放在项目根目录的dataset_raw/yebei文件夹,如果没有yebei文件夹,请先创建。

随后构建训练配置文件configs/config.json(数组类参数保持预处理脚本生成的默认值即可,下文以[...]略写):

{
  "train": {
    "log_interval": 200,
    "eval_interval": 800,
    "seed": 1234,
    "epochs": 10000,
    "learning_rate": 0.0001,
    "betas": [...],
    "eps": 1e-09,
    "batch_size": 6,
    "fp16_run": false,
    "lr_decay": 0.999875,
    "segment_size": 10240,
    "init_lr_ratio": 1,
    "warmup_epochs": 0,
    "c_mel": 45,
    "c_kl": 1.0,
    "use_sr": true,
    "max_speclen": 512,
    "port": "8001",
    "keep_ckpts": 10,
    "all_in_mem": false
  },
  "data": {
    "training_files": "filelists/train.txt",
    "validation_files": "filelists/val.txt",
    "max_wav_value": 32768.0,
    "sampling_rate": 44100,
    "filter_length": 2048,
    "hop_length": 512,
    "win_length": 2048,
    "n_mel_channels": 80,
    "mel_fmin": 0.0,
    "mel_fmax": 22050
  },
  "model": {
    "inter_channels": 192,
    "hidden_channels": 192,
    "filter_channels": 768,
    "n_heads": 2,
    "n_layers": 6,
    "kernel_size": 3,
    "p_dropout": 0.1,
    "resblock": "1",
    "resblock_kernel_sizes": [...],
    "resblock_dilation_sizes": [...],
    "upsample_rates": [...],
    "upsample_initial_channel": 512,
    "upsample_kernel_sizes": [...],
    "n_layers_q": 3,
    "use_spectral_norm": false,
    "gin_channels": 768,
    "ssl_dim": 768,
    "n_speakers": 1
  },
  "spk": {
    "yebei": 0
  }
}

这里的epochs指训练轮数,每一轮(epoch)是对整个训练集进行一次完整的训练。具体来说,每个epoch包含多个训练步骤,每个训练步骤会从训练集中抽取一个小批量的数据进行训练,并更新模型的参数。需要重点调整的参数是batch_size:如果显存不够,需要往下调整,否则会“爆显存”。如果训练过程中出现了下面这个错误:

torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 20.00 MiB (GPU 0; 8.00 GiB total capacity; 6.86 GiB already allocated; 0 bytes free; 7.25 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation.
See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

那么就说明显存已经不够用了。最后,运行命令开始训练:

python3 train.py -c configs/config.json -m 44k

终端会返回训练过程:

D:\work\so-vits-svc\workenv\lib\site-packages\torch\optim\lr_scheduler.py:139: UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`. Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate warnings.warn("Detected call of `lr_scheduler.step()` before `optimizer.step()`. " D:\work\so-vits-svc\workenv\lib\site-packages\torch\functional.py:641: UserWarning: stft with return_complex=False is deprecated. In a future pytorch release, stft will return complex tensors for all inputs, and return_complex=False will raise an error. Note: you can still call torch.view_as_real on the complex output to recover the old return format. (Triggered internally at C:\actions-runner\_work\pytorch\pytorch\builder\windows\pytorch\aten\src\ATen\native\SpectralOps.cpp:867.) return _VF.stft(input, n_fft, hop_length, win_length, window, # type: ignore[attr-defined] INFO:torch.nn.parallel.distributed:Reducer buckets have been rebuilt in this iteration. D:\work\so-vits-svc\workenv\lib\site-packages\torch\autograd\__init__.py:200: UserWarning: Grad strides do not match bucket view strides. This may indicate grad was not created according to the gradient layout contract, or that the param's strides changed since DDP was constructed. This is not an error, but may impair performance. grad.sizes() = [32, 1, 4], strides() = [4, 1, 1] bucket_view.sizes() = [32, 1, 4], strides() = [4, 4, 1] (Triggered internally at C:\actions-runner\_work\pytorch\pytorch\builder\windows\pytorch\torch\csrc\distributed\c10d\reducer.cpp:337.)
Variable._execution_engine.run_backward( # Calls into the C++ engine to run the backward pass INFO:torch.nn.parallel.distributed:Reducer buckets have been rebuilt in this iteration.
INFO:44k:====> Epoch: 274, cost 39.02 s
INFO:44k:====> Epoch: 275, cost 17.47 s
INFO:44k:====> Epoch: 276, cost 17.74 s
INFO:44k:====> Epoch: 277, cost 17.43 s
INFO:44k:====> Epoch: 278, cost 17.59 s
INFO:44k:====> Epoch: 279, cost 17.82 s
INFO:44k:====> Epoch: 280, cost 17.64 s
INFO:44k:====> Epoch: 281, cost 17.63 s
INFO:44k:Train Epoch: 282 [65%]
INFO:44k:Losses: [1.8697402477264404, 3.029414415359497, 11.415563583374023, 23.37869644165039, 0.2702481746673584], step: 6600, lr: 9.637943809624507e-05, reference_loss: 39.963661193847656

这里每一个Epoch结束,系统都会返回损失函数等相关信息。训练好的模型存放在项目的logs/44k目录下,模型文件的后缀名是.pth。

结语

一般情况下,训练损失率低于50%,并且损失函数在训练集和验证集上都趋于稳定,则可以认为模型已经收敛。收敛的模型就可以为我们所用了,如何使用训练好的模型,请移步:AI天后,在线飙歌,人工智能AI孙燕姿模型应用实践,复刻《遥远的歌》,原唱晴子(Python3.10)。最后,奉上民谣女神叶蓓的总训练6400次的音色模型,与众乡亲同飨:

pan.baidu.com/s/1m3VGc7RktaO5snHw6RPLjQ?pwd=pqkb 提取码:pqkb
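上文提到“损失函数趋于稳定则可以认为模型已经收敛”,这个判断也可以脚本化:从训练日志中提取reference_loss,检查最近若干次的波动幅度。下面是一个示意脚本,窗口大小与阈值均为经验值,并非so-vits-svc自带功能,仅供参考:

```python
import re

# 匹配训练日志中的 reference_loss 数值,例如:
# INFO:44k:Losses: [...], step: 6600, lr: ..., reference_loss: 39.9636
LOSS_PATTERN = re.compile(r"reference_loss:\s*([0-9.]+)")

def parse_reference_losses(log_lines):
    # 从日志行中依次提取所有 reference_loss 数值
    return [float(m.group(1)) for line in log_lines for m in LOSS_PATTERN.finditer(line)]

def has_converged(losses, window=5, tolerance=0.01):
    # 最近 window 次损失相对均值的波动小于 tolerance,即视为趋于稳定
    if len(losses) < window:
        return False
    recent = losses[-window:]
    mean = sum(recent) / window
    return max(abs(x - mean) for x in recent) / mean < tolerance
```

把tensorboard或终端日志逐行喂给parse_reference_losses,再用has_converged判断是否可以停止训练。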

AI天后,在线飙歌,人工智能AI孙燕姿模型应用实践,复刻《遥远的歌》,原唱晴子(Python3.10)

忽如一夜春风来,亚洲天后孙燕姿独特而柔美的音色再度响彻华语乐坛。只不过这一次,不是因为她出了新专辑,而是人工智能AI技术对孙燕姿音色的完美复刻,以大江灌浪之势对华语歌坛诸多经典作品进行了翻唱,还原度令人咋舌,如何做到的?本次我们借助基于Python3.10的开源库so-vits-svc,让亚洲天后孙燕姿帮我们免费演唱喜欢的歌曲,实现点歌自由。

so-vits-svc是基于VITS的开源项目。VITS(Variational Inference with adversarial learning for end-to-end Text-to-Speech)是一种结合变分推理(variational inference)、标准化流(normalizing flows)和对抗训练的高表现力语音合成模型。VITS通过隐变量而非频谱串联起语音合成中的声学模型和声码器,在隐变量上进行随机建模并利用随机时长预测器,提高了合成语音的多样性:输入同样的文本,能够合成不同声调和韵律的语音。

环境配置

首先确保本机已经安装好Python3.10的开发环境,随后使用Git命令克隆项目:

git clone https://github.com/svc-develop-team/so-vits-svc.git

随后进入项目的目录:

cd so-vits-svc

接着安装依赖,如果是Linux或者Mac系统,运行命令:

pip install -r requirements.txt

如果是Windows用户,需要使用Win系统专用的依赖文件:

pip install -r requirements_win.txt

依赖库安装成功之后,在项目的根目录运行命令,启动服务:

python webUI.py

程序返回:

PS D:\so-vits-svc> python .\webUI.py
DEBUG:charset_normalizer:Encoding detection: ascii is most likely the one.
C:\Users\zcxey\AppData\Roaming\Python\Python310\site-packages\gradio\deprecation.py:43: UserWarning: You have unused kwarg parameters in UploadButton, please remove them: {'variant': 'primary'} warnings.warn(
DEBUG:asyncio:Using proactor: IocpProactor
Running on local URL: http://127.0.0.1:7860
To create a public link, set `share=True` in `launch()`.

说明服务已经正常启动了。这里so-vits-svc会在后台运行一个基于Gradio框架的Web服务,端口号是7860,此时访问本地的网址 http://127.0.0.1:7860 即可。

此时,我们就可以加载模型。模型训练先按下不表,这里先使用已经训练好的孙燕姿音色模型:

链接:https://pan.baidu.com/s/1RwgRe6s4HCA2eNI5sxHZ9A?pwd=7b4a 提取码:7b4a

下载模型文件之后,将模型文件放入logs/44k目录:

D:\so-vits-svc\logs\44k>dir
驱动器 D 中的卷是 新加卷 卷的序列号是 9824-5798
D:\so-vits-svc\logs\44k 的目录
2023/05/10 12:31 <DIR> .
2023/05/10 11:49 <DIR> ..
2023/04/08 15:22 542,178,141 G_27200.pth
2023/04/08 15:54 15,433,721 kmeans_10000.pt
2023/05/10 11:49 0 put_pretrained_model_here
3 个文件 557,611,862 字节 2 个目录 475,872,493,568 可用字节
D:\so-vits-svc\logs\44k>

接着将模型的配置文件config.json放入configs目录:

D:\so-vits-svc\configs>dir
驱动器 D 中的卷是 新加卷 卷的序列号是 9824-5798
D:\so-vits-svc\configs 的目录
2023/05/10 11:49 <DIR> .
2023/05/10 12:23 <DIR> ..
2023/04/08 12:33 2,118 config.json
1 个文件 2,118 字节 2 个目录 475,872,493,568 可用字节
D:\so-vits-svc\configs>

随后,在页面中点击加载模型即可,至此环境就配置好了。

原始歌曲处理(人声和伴奏分离)

如果想要使用孙燕姿的模型进行推理,让孙燕姿同学唱别的歌手的歌,首先需要一段已经准备好的声音范本,然后使用模型把原来的音色换成孙燕姿模型训练好的音色,有些类似Stable-Diffusion的图像风格迁移,只不过是将绘画风格替换为音色和音准。这里我们使用晴子的《遥远的歌》,这首歌曲调悠扬,如泣如诉,和孙燕姿婉转的音色正好匹配。好吧,其实是因为这首歌比较简单,方便新手练习。

需要注意的是,模型推理过程中,需要的歌曲样本不应该包含伴奏,因为伴奏属于“噪音”,会影响模型的推理效果,毕竟我们替换的是歌手的“声音”,并非伴奏。

这里我们选择使用开源库Spleeter来对原歌曲进行人声和伴奏分离,首先安装spleeter:

pip3 install spleeter --user

接着运行命令,对《遥远的歌》进行分离操作:

spleeter separate -o d:/output/ -p spleeter:2stems d:/遥远的歌.mp3

这里-o代表输出目录,-p代表选择的分离模型,最后是要分离的素材。首次运行会比较慢,因为spleeter会下载预训练模型,体积在1.73GB左右。运行完毕后,会在输出目录生成分离后的音轨文件:

C:\Users\zcxey\Downloads\test>dir
驱动器 C 中的卷是 Windows 卷的序列号是 5607-6354
C:\Users\zcxey\Downloads\test 的目录
2023/05/09 13:17 <DIR> .
2023/05/10 20:57 <DIR> ..
2023/05/09 13:17 26,989,322 accompaniment.wav
2023/05/09 13:17 26,989,322 vocals.wav
2 个文件 53,978,644 字节 2 个目录 182,549,413,888 可用字节

其中vocals.wav为晴子的清唱声音,而accompaniment.wav则为伴奏。关于spleeter更多的操作,请移步:人工智能AI库Spleeter免费人声和背景音乐分离实践(Python3.10),这里不再赘述。至此,原始歌曲就处理好了。

歌曲推理

此时,将晴子的清唱声音vocals.wav文件添加到页面中,接着就是参数的调整。这里推理歌曲会有两个问题,声音沙哑和跑调,二者必居其一:F0均值滤波(池化)参数开启后可以有效改善沙哑问题,但有概率导致跑调;而降低该值则可以减少跑调的概率,但又会出现声音沙哑的问题。基本上,推理过程就是在这两个参数之间不断地权衡,所以每一次推理都需要认真地听一下歌曲有什么问题,然后调整参数的值,这里我最终的参数调整结果如上图所示。

推理出来的歌曲同样也是wav格式。此时我们将推理的清唱声音和之前分离出来的伴奏音乐accompaniment.wav混合即可,这里推荐使用FFMPEG的amix滤镜,假设推理导出的人声文件为vocals_out.wav:

ffmpeg -i vocals_out.wav -i accompaniment.wav -filter_complex amix=inputs=2:duration=longest output.wav

该命令可以把推理的人声wav和背景音乐wav混合为一个output.wav歌曲,也就是我们最终的作品。

结语

藉此,我们就完成了自由点歌让天后演唱的任务,如果后期配上画面和歌词的字幕,不失为一个精美的AI艺术品。在Youtube(B站)搜索关键字:刘悦的技术博客,即可欣赏最终的成品歌曲,欢迎诸君品鉴。
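人声与伴奏的混合也可以用ffmpeg的amix滤镜封装成一个可复用的Python小工具,方便批量处理。下面是一段示意代码,文件名均为示例,假设本机已安装ffmpeg并加入了PATH:

```python
import subprocess

def build_mix_cmd(vocals, accompaniment, output):
    # 用ffmpeg的amix滤镜把人声与伴奏两路输入混合为一个文件
    return [
        "ffmpeg", "-y",
        "-i", vocals,
        "-i", accompaniment,
        "-filter_complex", "amix=inputs=2:duration=longest",
        output,
    ]

def mix(vocals, accompaniment, output):
    # check=True:ffmpeg失败时抛出CalledProcessError,便于及时发现问题
    subprocess.run(build_mix_cmd(vocals, accompaniment, output), check=True)
```

调用方式例如 mix("vocals_out.wav", "accompaniment.wav", "output.wav")。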

Python3.10动态修改Windows系统(win10/win11)本地IP地址(静态IP)

一般情况下,局域网里的终端(比如本地服务器)设置静态IP的好处是可以有效减少网络连接时间,原因是联网时省略了每次从DHCP服务器获取IP地址的流程;缺点是容易引发IP地址冲突,当然,还有操作层面的繁琐:如果想要切换静态IP地址,就得去网络连接设置中手动操作。本次我们使用Python3.10动态地修改电脑的静态IP地址。

获取多网卡配置

一个网卡对应一个静态IP地址,但机器上未必只有一个网卡,所以如果想动态切换,必须要指定网卡名称。Win系统中通过ipconfig命令来获取当前系统的网卡信息:

ipconfig

系统返回:

PS C:\Users\liuyue\h2102-a\videopro> ipconfig Windows IP 配置 以太网适配器 以太网: 连接特定的 DNS 后缀 . . . . . . . : 本地链接 IPv6 地址. . . . . . . . : fe80::a216:f22a:52a:3388%4 IPv4 地址 . . . . . . . . . . . . : 192.168.1.104 子网掩码 . . . . . . . . . . . . : 255.255.255.0 默认网关. . . . . . . . . . . . . : 192.168.1.1 以太网适配器 以太网 2: 连接特定的 DNS 后缀 . . . . . . . : IPv6 地址 . . . . . . . . . . . . : fdb2:2c26:f4e4:0:7703:1e08:e622:2f0 临时 IPv6 地址. . . . . . . . . . : fdb2:2c26:f4e4:0:717c:b59e:b6cd:51b2 本地链接 IPv6 地址. . . . . . . . : fe80::2645:f265:ad72:c751%16 IPv4 地址 . . . . . . . . . . . . : 192.168.0.118 子网掩码 . . . . . . . . . . . . : 255.255.255.0 默认网关. . . . . . . . . . . . . : 以太网适配器 vEthernet (Default Switch): 连接特定的 DNS 后缀 . . . . . . . : 本地链接 IPv6 地址. . . . . . . . : fe80::3ece:9b38:2572:4e33%18 IPv4 地址 . . . . . . . . . . . . : 172.31.16.1 子网掩码 . . . . . . . . . . . . : 255.255.240.0 默认网关. . . . . . . . . . . . .
:

如果想通过Python来获取网卡信息,则需要在脚本中运行ipconfig命令,构建change_ip.py脚本:

import os
import re

class IpManage:

    def __init__(self):
        self.ip_list = self.get_ip()

    def get_ip(self):
        result = os.popen('ipconfig')
        res = result.read()
        resultlist = re.findall('(?<=以太网适配器 ).*?(?=:)|(?<=无线局域网适配器 ).*?(?=:)', res)
        print(resultlist)
        return resultlist

if __name__ == '__main__':
    IpManage()

这里通过os模块的popen方法来运行ipconfig命令,随后再使用正则来匹配网卡名称,最后将匹配到的网卡列表赋值给实例属性,程序返回:

['以太网', '以太网 2', 'vEthernet (Default Switch)'] [Finished in 394ms]

至此,三块网卡的名称就获取到了。

动态切换静态IP

接下来就是通过Python脚本来动态切换指定网卡的静态IP地址了,Windows系统通过netsh命令来指定IP地址:

netsh interface ip set address name=以太网 static 192.168.201.137 255.255.248.0 192.168.200.1

这里name参数是网卡名称,后面三个地址分别代表静态IP地址、子网掩码以及网关地址。这里将第一块网卡的静态IP地址设置为192.168.201.137,子网掩码是255.255.248.0,网关地址为192.168.200.1。随后在Windows的网络连接设置中进行查看,发现已经设置好了,随后再手动修改为自动获得IP地址选项。

下面通过Python脚本进行设置:

    def set_ip(self, name, ip="192.168.201.137", mask="255.255.248.0", gateway="192.168.200.1"):
        result = os.popen(f"netsh interface ip set address name={name} static {ip} {mask} {gateway}")
        res = result.read()
        print(res)

这里添加一个实例方法来设置IP地址,同样使用popen方法来运行命令,随后进行调用:

if __name__ == '__main__':
    im = IpManage()
    im.set_ip(im.ip_list[0])

这里对第一块网卡的IP地址进行指定操作。完整代码:

import os
import re

class IpManage:

    def __init__(self):
        self.ip_list = self.get_ip()

    def get_ip(self):
        result = os.popen('ipconfig')
        res = result.read()
        resultlist = re.findall('(?<=以太网适配器 ).*?(?=:)|(?<=无线局域网适配器 ).*?(?=:)', res)
        print(resultlist)
        return resultlist

    def set_ip(self, name, ip="192.168.201.137", mask="255.255.248.0", gateway="192.168.200.1"):
        result = os.popen(f"netsh interface ip set address name={name} static {ip} {mask} {gateway}")
        res = result.read()
        print(res)

if __name__ == '__main__':
    im = IpManage()
    im.set_ip(im.ip_list[0])

结语

藉此,我们就可以通过Python3.10动态地配置本地网卡的静态IP地址,也可以理解为一种Python自动化流程。静态IP地址可以让IP地址语义化,数据中心、网站、银行的结算端口等往往需要静态IP,与此同时,也省却了手动配置静态IP的繁琐过程。
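作为补充,os.popen在命令失败时不会报错,也拿不到返回码;下面给出一个基于subprocess的等价示意(正则与上文一致,netsh需要管理员权限才能生效,代码仅供参考):

```python
import re
import subprocess

# 与上文脚本相同的网卡名称匹配正则
ADAPTER_PATTERN = re.compile(
    "(?<=以太网适配器 ).*?(?=:)|(?<=无线局域网适配器 ).*?(?=:)"
)

def extract_adapters(ipconfig_output):
    # 从ipconfig的文本输出中提取网卡名称列表
    return ADAPTER_PATTERN.findall(ipconfig_output)

def get_adapters():
    # 中文Windows控制台输出通常是GBK编码,这里按GBK解码
    out = subprocess.run(
        ["ipconfig"], capture_output=True, check=True
    ).stdout.decode("gbk", errors="ignore")
    return extract_adapters(out)

def set_static_ip(name, ip, mask, gateway):
    # 调用netsh设置静态IP;check=True会在命令失败时抛出异常
    cmd = ["netsh", "interface", "ip", "set", "address",
           f"name={name}", "static", ip, mask, gateway]
    subprocess.run(cmd, check=True)
```

以列表形式传参还能避免网卡名称里的空格(如“以太网 2”)被shell错误切分。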

极速进化,光速转录,C++版本人工智能实时语音转文字(字幕/语音识别)Whisper.cpp实践

业界良心OpenAI开源的Whisper模型是开源语音转文字领域的执牛耳者,白璧微瑕之处在于无法通过苹果M芯片优化转录效率。Whisper.cpp则是Whisper模型的C/C++移植版本,它具有无依赖项、内存使用量低等特点,重要的是增加了Core ML支持,完美适配苹果M系列芯片。Whisper.cpp的张量运算符针对苹果M芯片的CPU进行了大量优化:根据计算规模,使用Arm Neon SIMD intrinsics或CBLAS Accelerate框架例程,后者对于更大的尺寸特别有效,因为Accelerate框架可以使用苹果M系列芯片中提供的专用AMX协处理器。

配置Whisper.cpp

老规矩,运行git命令来克隆Whisper.cpp项目:

git clone https://github.com/ggerganov/whisper.cpp.git

随后进入项目的目录:

cd whisper.cpp

项目默认的基础模型不支持中文,这里推荐使用medium模型,通过shell脚本进行下载:

bash ./models/download-ggml-model.sh medium

下载完成后,会在项目的models目录保存ggml-medium.bin模型文件,大小为1.53GB:

whisper.cpp git:(master) cd models ➜ models git:(master) ll total 3006000 -rw-r--r-- 1 liuyue staff 3.2K 4 21 07:21 README.md -rw-r--r-- 1 liuyue staff 7.2K 4 21 07:21 convert-h5-to-ggml.py -rw-r--r-- 1 liuyue staff 9.2K 4 21 07:21 convert-pt-to-ggml.py -rw-r--r-- 1 liuyue staff 13K 4 21 07:21 convert-whisper-to-coreml.py drwxr-xr-x 4 liuyue staff 128B 4 22 00:33 coreml-encoder-medium.mlpackage -rwxr-xr-x 1 liuyue staff 2.1K 4 21 07:21 download-coreml-model.sh -rw-r--r-- 1 liuyue staff 1.3K 4 21 07:21 download-ggml-model.cmd -rwxr-xr-x 1 liuyue staff 2.0K 4 21 07:21 download-ggml-model.sh -rw-r--r-- 1 liuyue staff 562K 4 21 07:21 for-tests-ggml-base.bin -rw-r--r-- 1 liuyue staff 573K 4 21 07:21 for-tests-ggml-base.en.bin -rw-r--r-- 1 liuyue staff 562K 4 21 07:21 for-tests-ggml-large.bin -rw-r--r-- 1 liuyue staff 562K 4 21 07:21 for-tests-ggml-medium.bin -rw-r--r-- 1 liuyue staff 573K 4 21 07:21 for-tests-ggml-medium.en.bin -rw-r--r-- 1 liuyue staff 562K 4 21 07:21 for-tests-ggml-small.bin -rw-r--r-- 1 liuyue staff 573K 4 21 07:21 for-tests-ggml-small.en.bin -rw-r--r-- 1 liuyue staff 562K 4 21 07:21 for-tests-ggml-tiny.bin -rw-r--r-- 1 liuyue staff 573K 4 21 07:21 for-tests-ggml-tiny.en.bin -rwxr-xr-x 1 liuyue staff 1.4K 4 21 07:21 generate-coreml-interface.sh -rwxr-xr-x@ 1 liuyue staff 769B 4 21 07:21 generate-coreml-model.sh -rw-r--r-- 1 liuyue staff 1.4G 3 22 16:04 ggml-medium.bin

模型下载以后,在根目录编译可执行文件:

make

程序返回:

➜
whisper.cpp git:(master) make I whisper.cpp build info: I UNAME_S: Darwin I UNAME_P: arm I UNAME_M: arm64 I CFLAGS: -I. -O3 -DNDEBUG -std=c11 -fPIC -pthread -DGGML_USE_ACCELERATE I CXXFLAGS: -I. -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -pthread I LDFLAGS: -framework Accelerate I CC: Apple clang version 14.0.3 (clang-1403.0.22.14.1) I CXX: Apple clang version 14.0.3 (clang-1403.0.22.14.1) c++ -I. -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -pthread examples/bench/bench.cpp ggml.o whisper.o -o bench -framework Accelerate

至此,Whisper.cpp就配置好了。

牛刀小试

现在我们来测试一段语音,看看效果:

./main -osrt -m ./models/ggml-medium.bin -f samples/jfk.wav

这行命令的含义是通过刚才下载的ggml-medium.bin模型来对项目中的samples/jfk.wav语音文件进行识别,这段语音是遇刺的美国总统肯尼迪的著名演讲,程序返回:

➜ whisper.cpp git:(master) ./main -osrt -m ./models/ggml-medium.bin -f samples/jfk.wav whisper_init_from_file_no_state: loading model from './models/ggml-medium.bin' whisper_model_load: loading model whisper_model_load: n_vocab = 51865 whisper_model_load: n_audio_ctx = 1500 whisper_model_load: n_audio_state = 1024 whisper_model_load: n_audio_head = 16 whisper_model_load: n_audio_layer = 24 whisper_model_load: n_text_ctx = 448 whisper_model_load: n_text_state = 1024 whisper_model_load: n_text_head = 16 whisper_model_load: n_text_layer = 24 whisper_model_load: n_mels = 80 whisper_model_load: f16 = 1 whisper_model_load: type = 4 whisper_model_load: mem required = 1725.00 MB (+ 43.00 MB per decoder) whisper_model_load: adding 1608 extra tokens whisper_model_load: model ctx = 1462.35 MB whisper_model_load: model size = 1462.12 MB whisper_init_state: kv self size = 42.00 MB whisper_init_state: kv cross size = 140.62 MB system_info: n_threads = 4 / 10 | AVX = 0 | AVX2 = 0 | AVX512 = 0 | FMA = 0 | NEON = 1 | ARM_FMA = 1 | F16C = 0 | FP16_VA = 1 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 0 | VSX = 0 | COREML = 0 | main: processing 'samples/jfk.wav' (176000 samples, 11.0 sec), 4 threads, 1 processors, lang = en, task = transcribe, timestamps = 1 ...
[00:00:00.000 --> 00:00:11.000] And so, my fellow Americans, ask not what your country can do for you, ask what you can do for your country.

接下来测试中文语音转录。Whisper.cpp只接受16kHz采样率的wav文件作为输入,所以先用ffmpeg把准备好的mp3素材转换为16kHz单声道的wav文件:

ffmpeg -i ./test1.mp3 -ar 16000 -ac 1 -c:a pcm_s16le ./test1.wav

程序返回:

built with Apple clang version 14.0.0 (clang-1400.0.29.202) configuration: --prefix=/opt/homebrew/Cellar/ffmpeg/5.1.2_1 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libdav1d --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox --enable-neon libavutil 57. 28.100 / 57. 28.100 libavcodec 59. 37.100 / 59. 37.100 libavformat 59. 27.100 / 59. 27.100 libavdevice 59. 7.100 / 59. 7.100 libavfilter 8. 44.100 / 8. 44.100 libswscale 6. 7.100 / 6. 7.100 libswresample 4. 7.100 / 4. 7.100 libpostproc 56. 6.100 / 56. 6.100 [mp3 @ 0x130e05580] Estimating duration from bitrate, this may be inaccurate Input #0, mp3, from './test1.mp3': Duration: 00:05:41.33, start: 0.000000, bitrate: 48 kb/s Stream #0:0: Audio: mp3, 24000 Hz, mono, fltp, 48 kb/s Stream mapping: Stream #0:0 -> #0:0 (mp3 (mp3float) -> pcm_s16le (native)) Press [q] to stop, [?]
for help

Output #0, wav, to './test1.wav': Metadata: ISFT : Lavf59.27.100 Stream #0:0: Audio: pcm_s16le ([1][0][0][0] / 0x0001), 16000 Hz, mono, s16, 256 kb/s Metadata: encoder : Lavc59.37.100 pcm_s16le
[mp3float @ 0x132004260] overread, skip -6 enddists: -4 -4
……(同类overread告警从略)
size= 10667kB time=00:05:41.32 bitrate= 256.0kbits/s speed=2.08e+03x video:0kB audio:10666kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.000714%

这里将一段五分四十一秒的语音转换为了16kHz的wav文件。随后运行命令开始转录:

./main -osrt -m ./models/ggml-medium.bin -f samples/test1.wav -l zh

这里需要加上参数-l,告知程序为中文语音,程序返回:

➜ whisper.cpp git:(master) ./main -osrt -m ./models/ggml-medium.bin -f samples/test1.wav -l zh whisper_init_from_file_no_state: loading model from './models/ggml-medium.bin' whisper_model_load: loading model whisper_model_load: n_vocab = 51865 whisper_model_load: n_audio_ctx = 1500 whisper_model_load:
n_audio_state = 1024 whisper_model_load: n_audio_head = 16 whisper_model_load: n_audio_layer = 24 whisper_model_load: n_text_ctx = 448 whisper_model_load: n_text_state = 1024 whisper_model_load: n_text_head = 16 whisper_model_load: n_text_layer = 24 whisper_model_load: n_mels = 80 whisper_model_load: f16 = 1 whisper_model_load: type = 4 whisper_model_load: mem required = 1725.00 MB (+ 43.00 MB per decoder) whisper_model_load: adding 1608 extra tokens whisper_model_load: model ctx = 1462.35 MB whisper_model_load: model size = 1462.12 MB whisper_init_state: kv self size = 42.00 MB whisper_init_state: kv cross size = 140.62 MB system_info: n_threads = 4 / 10 | AVX = 0 | AVX2 = 0 | AVX512 = 0 | FMA = 0 | NEON = 1 | ARM_FMA = 1 | F16C = 0 | FP16_VA = 1 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 0 | VSX = 0 | COREML = 0 | main: processing 'samples/test1.wav' (5461248 samples, 341.3 sec), 4 threads, 1 processors, lang = zh, task = transcribe, timestamps = 1 ... [00:00:00.000 --> 00:00:03.340] Hello 大家好,这里是刘越的技术博客。 [00:00:03.340 --> 00:00:05.720] 最近的事情大家都晓得了, [00:00:05.720 --> 00:00:07.880] 某公司技术经理魅上欺下, [00:00:07.880 --> 00:00:10.380] 打工人应对进队,不易快灾, [00:00:10.380 --> 00:00:12.020] 不易壮灾, [00:00:12.020 --> 00:00:14.280] 所谓魅上者必欺下, [00:00:14.280 --> 00:00:16.020] 古人诚不我窃。 [00:00:16.020 --> 00:00:17.360] 技术经理者, [00:00:17.360 --> 00:00:20.160] 公然在聊天群里大玩职场PUA, [00:00:20.160 --> 00:00:22.400] 气焰嚣张,有恃无恐, [00:00:22.400 --> 00:00:23.700] 最终引发众目, [00:00:23.700 --> 00:00:26.500] 嘿嘿,技术经理,团队领导, [00:00:26.500 --> 00:00:29.300] 原来团队领导这四个字是这么用的, [00:00:29.300 --> 00:00:31.540] 奴媚显达,构陷下属, [00:00:31.540 --> 00:00:32.780] 人文巨损, [00:00:32.780 --> 00:00:33.840] 逢迎上意, [00:00:33.840 --> 00:00:34.980] 傲然下欺, [00:00:34.980 --> 00:00:36.080] 装腔作势, [00:00:36.080 --> 00:00:37.180] 极尽投机, [00:00:37.180 --> 00:00:38.320] 负他人之负, [00:00:38.320 --> 00:00:39.620] 康他人之愷, [00:00:39.620 --> 00:00:42.180] 如此者,可谓团队领导也。 [00:00:42.180 --> 00:00:43.980] 中国的所谓传统文化, [00:00:43.980 --> 00:00:45.320] 除了仁义理智性, [00:00:45.320 --> 
00:00:46.620] 除了金石子极, [00:00:46.620 --> 00:00:47.820] 除了争争风骨, [00:00:47.820 --> 00:00:49.560] 其实还有很多别的东西, [00:00:49.560 --> 00:00:52.020] 被大家或有意或无意的忽视了, [00:00:52.020 --> 00:00:53.300] 比如功利实用, [00:00:53.300 --> 00:00:54.300] 屈颜附示, [00:00:54.300 --> 00:00:55.360] 以兼至善, [00:00:55.360 --> 00:01:01.000] 官本位和钱规则的传统,在某种程度上,传统文化这没硬币的另一面, [00:01:01.000 --> 00:01:03.900] 才是更需要我们去面对和正视的, [00:01:03.900 --> 00:01:07.140] 我以为,这在目前盛行实惠价值观的时候, [00:01:07.140 --> 00:01:08.940] 提一提还是必要的, [00:01:08.940 --> 00:01:10.240] 有的人说了, [00:01:10.240 --> 00:01:13.740] 在开发群里对领导,非常痛快,非常爽, [00:01:13.740 --> 00:01:17.180] 但是,然后呢,有用吗? [00:01:17.180 --> 00:01:19.260] 倒霉的还不是自己, [00:01:19.260 --> 00:01:22.520] 没错,这就是功利且实用的传统, [00:01:22.520 --> 00:01:28.780] 各种精神,思辨,反抗,愤怒,都抵不过三个字,有用吗? [00:01:28.780 --> 00:01:31.820] 事实上,但凡叫做某种精神的, [00:01:31.820 --> 00:01:33.320] 那就是哲学思辨, [00:01:33.320 --> 00:01:36.220] 就是一种相对无用的思辨和学术, [00:01:36.220 --> 00:01:39.180] 而中国职场有很强的实用传统, [00:01:39.180 --> 00:01:42.140] 但这不是学术思辨,也没有理论构架, [00:01:42.140 --> 00:01:44.380] 仅仅是一种短视的经验论, [00:01:44.380 --> 00:01:47.220] 所以,功利主义,是密尔, [00:01:47.220 --> 00:01:48.980] 编庆的伦理价值学说, [00:01:48.980 --> 00:01:52.700] 强调的是,追求幸福,如何获得最大效用, [00:01:52.700 --> 00:01:55.580] 实用主义,是西方的一个学术流派, [00:01:55.580 --> 00:01:58.260] 比如杜威,胡适,就是代表, [00:01:58.260 --> 00:02:01.180] 实用主义的另一个名字,叫人本主义, [00:02:01.180 --> 00:02:04.780] 意思是,以人作为经验和万物的尺度, [00:02:04.780 --> 00:02:06.080] 换句话说, [00:02:06.080 --> 00:02:09.420] 功利主义,反对的正是那种短视的功利, [00:02:09.420 --> 00:02:13.220] 实用主义,反对的也正是那种凡是看对自己, [00:02:13.220 --> 00:02:15.220] 是不是有利的局限判断, [00:02:15.220 --> 00:02:17.260] 而在中国职场功利, [00:02:17.260 --> 00:02:21.060] 实用的传统中,恰恰是不会有这些理论构架的, [00:02:21.060 --> 00:02:23.700] 并且,不仅没有理论构架, [00:02:23.700 --> 00:02:26.140] 还要对那些无用的,思辨的, [00:02:26.140 --> 00:02:29.980] 纯粹的精神,视如避喜,吃之以鼻, [00:02:29.980 --> 00:02:32.260] 没错,在技术团队里, [00:02:32.260 --> 00:02:35.260] 我们重视技术,重视实用的科学, [00:02:35.260 --> 00:02:38.900] 但是主流职场并不鼓励去搞那些看似无用的东西, [00:02:38.900 --> 00:02:41.380] 比如普通劳动者的合法权益, [00:02:41.380 
--> 00:02:43.580] 张义谋的满江红, [00:02:43.580 --> 00:02:45.220] 大家想必也都看了的, [00:02:45.220 --> 00:02:46.820] 人们总觉得很奇怪, [00:02:46.820 --> 00:02:48.300] 为什么那么坏的人, [00:02:48.300 --> 00:02:50.020] 皇帝为啥不罢免他? [00:02:50.020 --> 00:02:53.140] 为什么小人能当权来构陷好人呢? [00:02:53.140 --> 00:02:55.980] 当我们了解了传统文化中的法家思想, [00:02:55.980 --> 00:02:57.300] 就了然了, [00:02:57.300 --> 00:02:59.260] 在法家的思想规则下, [00:02:59.260 --> 00:03:01.660] 小人得是,忠良备辱, [00:03:01.660 --> 00:03:03.140] 事事所必然, [00:03:03.140 --> 00:03:04.900] 因为他一开始的设定, [00:03:04.900 --> 00:03:07.540] 就使得劣币驱逐良币的游戏规则, [00:03:07.540 --> 00:03:09.940] 所以,在这种观念下, [00:03:09.940 --> 00:03:12.460] 古代常见的一种职场智慧就是, [00:03:12.460 --> 00:03:14.820] 自污名节,以求自保, [00:03:14.820 --> 00:03:16.420] 在这种环境下, [00:03:16.420 --> 00:03:17.780] 要想生存, [00:03:17.780 --> 00:03:19.260] 就只有一条出路, [00:03:19.260 --> 00:03:20.900] 那就是依附权力, [00:03:20.900 --> 00:03:23.700] 并且,谁能拥有更大的权力, [00:03:23.700 --> 00:03:25.700] 谁就能生存得更好, [00:03:25.700 --> 00:03:27.500] 如何依附权力呢? [00:03:27.500 --> 00:03:29.180] 那就是现在正在发生的, [00:03:29.180 --> 00:03:31.900] 肆无忌惮的大腕职场PUA, [00:03:31.900 --> 00:03:33.060] 除此之外, [00:03:33.060 --> 00:03:34.340] 这种权力关系, [00:03:34.340 --> 00:03:36.900] 在古代会渗透到方方面面, [00:03:36.900 --> 00:03:40.300] 因为权力系统是一个复杂而高效的运行机器, [00:03:40.300 --> 00:03:42.940] CPU,内存,硬盘, [00:03:42.940 --> 00:03:44.900] 甚至一颗C面底螺丝钉, [00:03:44.900 --> 00:03:47.140] 都是权力机器上的一个环节, [00:03:47.140 --> 00:03:48.060] 于是, [00:03:48.060 --> 00:03:50.420] 官僚体系之外的一切职场人, [00:03:50.420 --> 00:03:52.340] 都会面临一个尴尬的处境, [00:03:52.340 --> 00:03:54.340] 一方面遭遇权力的打压, [00:03:54.340 --> 00:03:55.340] 另一方面, [00:03:55.340 --> 00:03:57.900] 也都会多少尝到权力的甜头, [00:03:57.900 --> 00:03:58.900] 于是乎, [00:03:58.900 --> 00:04:01.420] 权力的细胞渗透到角角落落, [00:04:01.420 --> 00:04:02.980] 即便没有组织权力, [00:04:02.980 --> 00:04:04.620] 也要追求文化权力, [00:04:04.620 --> 00:04:05.500] 父权, [00:04:05.500 --> 00:04:06.380] 夫权, [00:04:06.380 --> 00:04:07.460] 家长权力, [00:04:07.460 --> 00:04:08.580] 宗族权力, [00:04:08.580 --> 00:04:09.660] 老师权力, [00:04:09.660 --> 
00:04:10.780] 公司权力, [00:04:10.780 --> 00:04:12.140] 团队领导权力, [00:04:12.140 --> 00:04:13.100] 点点滴滴, [00:04:13.100 --> 00:04:15.580] 滴滴点点,追逐权力, [00:04:15.580 --> 00:04:18.140] 几乎成为人们生活的全部意义, [00:04:18.140 --> 00:04:18.980] 故而, [00:04:18.980 --> 00:04:19.980] 服从权力, [00:04:19.980 --> 00:04:21.180] 服从上级, [00:04:21.180 --> 00:04:22.420] 不得罪同事, [00:04:22.420 --> 00:04:23.660] 不得罪朋友, [00:04:23.660 --> 00:04:25.060] 不得罪陌生人, [00:04:25.060 --> 00:04:26.100] 因为你不知道, [00:04:26.100 --> 00:04:28.260] 他们背后有什么的权力关系, [00:04:28.260 --> 00:04:30.940] 他们又会不会用这个权力来对付你, [00:04:30.940 --> 00:04:31.940] 没错, [00:04:31.940 --> 00:04:34.380] 当我们解构群里那位领导的行为时, [00:04:34.380 --> 00:04:36.220] 我们也在解构我们自己, [00:04:36.220 --> 00:04:37.420] 毫无疑问, [00:04:37.420 --> 00:04:39.380] 对于这位敢于发声的职场人, [00:04:39.380 --> 00:04:41.180] 深安职场底层逻辑的, [00:04:41.180 --> 00:04:43.220] 我们一定能猜到他的结局, [00:04:43.220 --> 00:04:44.700] 他的结局是注定的, [00:04:44.700 --> 00:04:46.220] 同时也是悲哀的, [00:04:46.220 --> 00:04:47.340] 问题是, [00:04:47.340 --> 00:04:48.540] 这样做, [00:04:48.540 --> 00:04:49.660] 值得吗? 
[00:04:49.660 --> 00:04:52.580] 香港著名导演王家卫拍过一部电影, [00:04:52.580 --> 00:04:54.420] 叫做东邪西毒, [00:04:54.420 --> 00:04:56.340] 电影中有这样一个情节, [00:04:56.340 --> 00:04:59.620] 有个女人的弟弟被太尉府的一群刀客杀了, [00:04:59.620 --> 00:05:00.860] 他想报仇, [00:05:00.860 --> 00:05:02.300] 可自己没有武功, [00:05:02.300 --> 00:05:04.060] 只能请刀客出手, [00:05:04.060 --> 00:05:05.540] 但家里穷没钱, [00:05:05.540 --> 00:05:08.540] 最有价值的资产是一篮子鸡蛋, [00:05:08.540 --> 00:05:09.260] 于是, [00:05:09.260 --> 00:05:10.900] 他提着那一篮子鸡蛋, [00:05:10.900 --> 00:05:13.420] 天天站在刀客剑客们经过的路口, [00:05:13.420 --> 00:05:14.700] 请求他们出手, [00:05:14.700 --> 00:05:16.220] 报仇就是鸡蛋, [00:05:16.220 --> 00:05:17.860] 没有人愿意为了鸡蛋, [00:05:17.860 --> 00:05:20.020] 去单挑太尉府的刀客, [00:05:20.020 --> 00:05:21.460] 除了洪七, [00:05:21.460 --> 00:05:24.260] 洪七独自力战太尉府那帮刀客, [00:05:24.260 --> 00:05:26.780] 所得的报仇是一个鸡蛋, [00:05:26.780 --> 00:05:29.020] 但是洪七付出的代价太大, [00:05:29.020 --> 00:05:30.060] 混战中, [00:05:30.060 --> 00:05:32.700] 洪七被对手砍断了一根手指, [00:05:32.700 --> 00:05:33.820] 为了一个鸡蛋, [00:05:33.820 --> 00:05:35.500] 而失去一只手指, [00:05:35.500 --> 00:05:36.740] 值得吗? 
[00:05:36.740 --> 00:05:37.860] 不值得, [00:05:37.860 --> 00:05:39.300] 但是我觉得痛快, [00:05:39.300 --> 00:05:40.540] 因為這才是我自己

output_srt: saving output to 'samples/test1.wav.srt'
whisper_print_timings: load time = 978.82 ms
whisper_print_timings: fallbacks = 0 p / 0 h
whisper_print_timings: mel time = 438.81 ms
whisper_print_timings: sample time = 980.66 ms / 2343 runs ( 0.42 ms per run)
whisper_print_timings: encode time = 31476.10 ms / 13 runs ( 2421.24 ms per run)
whisper_print_timings: decode time = 47833.70 ms / 2343 runs ( 20.42 ms per run)
whisper_print_timings: total time = 81797.88 ms

五分钟的语音,只需要一分钟多一点就可以转录完成,效率满分。当然,精确度还有待提高,提高精确度可以选择large模型,但转录时间会相应增加。

苹果M芯片模型转换

基于苹果Mac系统的用户有福了,Whisper.cpp可以通过Core ML在Apple Neural Engine(ANE)上执行编码器推理,这可以比仅使用CPU执行快出三倍以上。首先安装转换依赖:

pip install ane_transformers
pip install openai-whisper
pip install coremltools

接着运行转换脚本:

./models/generate-coreml-model.sh medium

这里参数即模型的名称。程序返回:

➜ models git:(master) python3 convert-whisper-to-coreml.py --model medium --encoder-only True scikit-learn version 1.2.0 is not supported. Minimum required version: 0.17. Maximum required version: 1.1.2. Disabling scikit-learn conversion API. ModelDimensions(n_mels=80, n_audio_ctx=1500, n_audio_state=1024, n_audio_head=16, n_audio_layer=24, n_vocab=51865, n_text_ctx=448, n_text_state=1024, n_text_head=16, n_text_layer=24) /opt/homebrew/lib/python3.10/site-packages/whisper/model.py:166: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs! assert x.shape[1:] == self.positional_embedding.shape, "incorrect audio shape" /opt/homebrew/lib/python3.10/site-packages/whisper/model.py:97: UserWarning: __floordiv__ is deprecated, and its behavior will change in a future version of pytorch. It currently rounds toward 0 (like the 'trunc' function NOT 'floor').
This results in incorrect rounding for negative values. To keep the current behavior, use torch.div(a, b, rounding_mode='trunc'), or for actual floor division, use torch.div(a, b, rounding_mode='floor'). scale = (n_state // self.n_head) ** -0.25 Converting PyTorch Frontend ==> MIL Ops: 100%|▉| 1971/1972 [00:00<00:00, 3247.25 Running MIL frontend_pytorch pipeline: 100%|█| 5/5 [00:00<00:00, 54.69 passes/s] Running MIL default pipeline: 100%|████████| 57/57 [00:09<00:00, 6.29 passes/s] Running MIL backend_mlprogram pipeline: 100%|█| 10/10 [00:00<00:00, 444.13 passe

人工智能AI图像风格迁移(StyleTransfer),基于双层ControlNet(Python3.10)

图像风格迁移(Style Transfer)是一种计算机视觉技术,旨在将一幅图像的风格应用到另一幅图像上,从而生成一幅新图像。该新图像结合了两幅原始图像的特点,目的是达到一种风格化叠加的效果。本次我们使用Stable-Diffusion结合ControlNet来实现图像风格迁移效果。

安装ControlNet插件

首先确保本地已经安装并且配置好了Stable-Diffusion-Webui服务,关于Stable-Diffusion-Webui,请参见:人工智能,丹青圣手,全平台(原生/Docker)构建Stable-Diffusion-Webui的AI绘画库教程(Python3.10/Pytorch1.13.0),这里不再赘述。

随后进入项目目录,启动Stable-Diffusion-Webui服务:

python3 launch.py

如果是没有N卡的电脑,就使用cpu模式启动:

python3 launch.py --skip-torch-cuda-test --upcast-sampling --use-cpu interrogate

接着访问 http://localhost:7860 选择插件(Extensions)选项卡,点击从url安装,输入插件地址:

github.com/Mikubill/sd-webui-controlnet.git

安装成功后,重启WebUI界面。由于ControlNet默认只启用一层网络,而风格化操作我们需要两层,所以在设置选单(Settings)中,将可用的ControlNet网络层数设置为2。

设置好之后,下载模型文件:

huggingface.co/webui/ControlNet-modules-safetensors/tree/main

将模型放入stable-diffusion-webui/extensions/sd-webui-controlnet/models目录。这里还需要单独下载一个风格迁移模型,地址是:

huggingface.co/TencentARC/T2I-Adapter/blob/main/models/t2iadapter_style_sd14v1.pth

同样放入stable-diffusion-webui/extensions/sd-webui-controlnet/models目录。至此,Stable-Diffusion-Webui服务的ControlNet插件就配置好了。

风格迁移

现在,我们打开ControlNet的第一个图层,将原始图像的轮廓渲染出来,因为需要保证原始图像的基本形状。这里预处理器选择hed(整体嵌套边缘检测),模型使用ControlNet对应的hed模型即可。可以看到基本轮廓已经得到了保留,风格化只负责颜色和线条。

随后配置第二个ControlNet图层,预处理器选择t2ia_style_clipvision,模型选择刚刚下载的t2iadapter_style_sd14v1.pth,默认图像权重为1,先不要动。接着上传一张目标风格的图片,这里我们选择文森特·梵高的表现主义作品《星空》,随后点击Generate按钮做图生图(img2img)操作即可。

过拟合问题(Overfitting)

经过一段时间的本地推理,生成结果并不尽如人意,这也是大多数深度学习入门者会遇到的问题,也就是过拟合问题。过拟合(Overfitting)是指在训练模型时,模型过度地学习了训练数据的特征和噪声,从而导致模型在新数据上表现不佳的问题。通俗地讲,过拟合就像是一名学生背诵考试答案,但他只是死记硬背了考试题目的答案,没有真正理解题目的本质和解题思路。当他遇到新的考试题目时,由于没有理解题目的本质和解题思路,他就无法正确回答。在机器学习中,过拟合的原因是模型复杂度过高,导致模型对训练数据中的噪声和特征都过度追求,并且忽略了数据背后的本质规律和特征。因此,当模型面对新的数据时,由于没有真正理解数据的本质规律和特征,它就无法正确地对新数据进行预测。

说白了,就是对原始图的特征过分追求,从而淡化了目标图的风格。还记得ControlNet默认权重是1吗?这里我们只需要将权重往下调整,比如调成0.8,再次尝试生成,效果不错:既保留了原始图的大部分细节,又增加了梵高的表现主义风格。当然了,权重也不能一味地往下调整,否则也会出现欠拟合(Underfitting)问题,整个风格化迁移的过程也可以理解为一种“调参”的过程。

结语

通过Stable-Diffusion结合ControlNet插件,我们可以得到一幅新的图像,该图像结合了两幅原始图像的特点,既具有内容图像的内容,又具有风格图像的风格。图像风格迁移也可以应用于其他领域,比如电影、游戏、虚拟现实和动画创作等等。
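顺带一提,如果想绕过网页界面批量做风格迁移,可以调用Stable-Diffusion-Webui的img2img接口(/sdapi/v1/img2img),并在请求体里携带两个ControlNet单元。下面是构造请求体的示意代码:字段名基于sd-webui-controlnet插件的API约定,模型名以本地实际文件为准,图片需以base64字符串传入,此处均以占位符代替:

```python
def build_img2img_payload(init_image_b64, style_image_b64, prompt, style_weight=0.8):
    # 构造带双层ControlNet的img2img请求体(字段名为假设,以插件文档为准)
    controlnet_units = [
        {   # 第一层:hed轮廓控制,保留原图基本形状
            "module": "hed",
            "model": "control_hed-fp16",  # 以本地实际模型名为准
            "image": init_image_b64,
            "weight": 1.0,
        },
        {   # 第二层:风格迁移,权重调低可缓解对原图特征的“过拟合”
            "module": "t2ia_style_clipvision",
            "model": "t2iadapter_style_sd14v1",
            "image": style_image_b64,
            "weight": style_weight,
        },
    ]
    return {
        "init_images": [init_image_b64],
        "prompt": prompt,
        "denoising_strength": 0.75,
        "alwayson_scripts": {"controlnet": {"args": controlnet_units}},
    }
```

将返回的字典以JSON形式POST到 http://localhost:7860/sdapi/v1/img2img 即可(例如使用requests库);调整style_weight就相当于在界面上拖动第二层的权重滑杆。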

任务拆解,悠然自得,自动版本的ChatGPT,AutoGPT自动人工智能AI任务实践(Python3.10)

当我们使用ChatGPT完成某些工作的时候,往往需要多轮对话,比如让ChatGPT分析、翻译、总结一篇网上的文章或者文档,再将总结的结果以文本的形式存储在本地。过程中免不了要和ChatGPT“折冲樽俎”一番。事实上,这个“交涉”的过程也可以自动化,AutoGPT可以帮助我们自动拆解任务,没错,程序能做到的事情,人类绝不亲力亲为。我们唯一需要做的,就是告诉AutoGPT一个任务目标,AutoGPT会自动根据任务目标将任务拆解成一个个的小任务,并且逐个完成,简单且高效。

配置AutoGPT

先确保本地环境安装好了Python3.10.9。接着运行Git命令拉取项目:

git clone https://github.com/Significant-Gravitas/Auto-GPT.git

随后进入项目的目录:

cd Auto-GPT

安装相关的依赖库:

pip3 install -r requirements.txt

安装成功后,复制一下项目的配置文件:

cp .env.template .env

这里通过cp命令将配置文件模版.env.template复制成为一个新的配置文件.env。随后将OpenAI的密钥填入配置文件:

### OPENAI
# OPENAI_API_KEY - OpenAI API Key (Example: my-openai-api-key)
# TEMPERATURE - Sets temperature in OpenAI (Default: 0)
# USE_AZURE - Use Azure OpenAI or not (Default: False)
OPENAI_API_KEY=您的密钥
TEMPERATURE=0
USE_AZURE=False

除了OpenAI官方的接口密钥,AutoGPT也支持微软Azure的接口。如果希望使用微软Azure的接口,需要将配置中的USE_AZURE设置为True,随后复制azure.yaml.template配置模版为新的azure.yaml配置文件,接着将微软Azure服务的密钥填入azure.yaml即可。由于微软Azure接入OpenAI接口需要极其复杂的申请流程,这里还是直接使用OpenAI官方的接口。

当然了,如果不想在本地装那么多依赖,也可以通过Docker来构建Auto-GPT的容器:

docker build -t autogpt .
docker run -it --env-file=./.env -v $PWD/auto_gpt_workspace:/app/auto_gpt_workspace autogpt

这里Docker会自动读取项目中的Dockerfile配置文件进行构建,相当方便。至此,Auto-GPT就配置好了。

运行Auto-GPT

在项目根目录运行命令:

python3 -m autogpt --debug

即可启动AutoGPT:

➜ Auto-GPT git:(master) python -m autogpt --debug
Warning: The file 'AutoGpt.json' does not exist. Local memory would not be saved to a file.
Debug Mode: ENABLED
Welcome to Auto-GPT!
Enter the name of your AI and its role below. Entering nothing will load defaults.
Name your AI: For example, 'Entrepreneur-GPT'
AI Name:

首先创建AutoGPT机器人的名字:

AI Name: v3u.cn
v3u.cn here! I am at your service.
Describe your AI's role: For example, 'an AI designed to autonomously develop and run businesses with the sole goal of increasing your net worth.'
v3u.cn is:创建好名字以后,Auto-GPT就可以随时为您效劳了。 首先为AutoGPT设置目标:v3u.cn is: Analyze the contents of this article,the url is https://v3u.cn/a_id_303,and write the result to goal.txt这里我们要求AutoGPT分析并且总结v3u.cn/a\_id\_303这篇文章,并且将分析结果写入本地的goal.txt文件。 程序返回:Enter up to 5 goals for your AI: For example: Increase net worth, Grow Twitter Account, Develop and manage multiple businesses autonomously' Enter nothing to load defaults, enter nothing when finished. Goal 1: Using memory of type: LocalCacheAutoGPT会告诉你可以最多拆解为五个任务,我们可以自己拆解,也可以让机器人帮助我们拆解,直接按回车,让AutoGPT自动拆解任务即可。 接着程序会自动爬取这篇文章的内容,然后使用gpt-3.5-turbo模型来进行分析:Goal 1: Using memory of type: LocalCache Using Browser: chrome Token limit: 4000 Memory Stats: (0, (0, 1536)) Token limit: 4000 Send Token Count: 936 Tokens remaining for response: 3064 ------------ CONTEXT SENT TO AI --------------- System: The current time and date is Mon Apr 17 20:29:37 2023 System: This reminds you of these events from your past: User: Determine which next command to use, and respond using the format specified above: ----------- END OF CONTEXT ---------------- Creating chat completion with model gpt-3.5-turbo, temperature 0.0, max_tokens 3064 The JSON object is valid. THOUGHTS: Let's start by browsing the article at https://v3u.cn/a_id_303 to get a better understanding of its contents. REASONING: Browsing the article will allow us to analyze its contents and determine the appropriate next steps. PLAN: - Browse the article - Analyze its contents - Determine the appropriate next steps CRITICISM: None NEXT ACTION: COMMAND = browse_website ARGUMENTS = {'url': 'https://v3u.cn/a_id_303', 'question': 'analyze the contents of the article'} Enter 'y' to authorise command, 'y -N' to run N continuous commands, 'n' to exit program, or enter feedback for ... 
Input:y -=-=-=-=-=-=-= COMMAND AUTHORISED BY USER -=-=-=-=-=-=-= [WDM] - Downloading: 100%|████████████████████████████████████████████| 8.04M/8.04M [00:03<00:00, 2.42MB/s] Text length: 6977 characters Adding chunk 1 / 1 to memory Summarizing chunk 1 / 1 Creating chat completion with model gpt-3.5-turbo, temperature 0.0, max_tokens 300 SYSTEM: Command browse_website returned: Error: This model's maximum context length is 4097 tokens. However, you requested 4339 tokens (4039 in the messages, 300 in the completion). Please reduce the length of the messages or completion. Token limit: 4000 Memory Stats: (2, (2, 1536)) Token limit: 4000 Send Token Count: 1472 Tokens remaining for response: 2528 ------------ CONTEXT SENT TO AI --------------- System: The current time and date is Mon Apr 17 20:30:19 2023 System: This reminds you of these events from your past: ['Assistant Reply: {\n "thoughts": {\n "text": "Let\'s start by browsing the article at https://v3u.cn/a_id_303 to get a better understanding of its contents.",\n "reasoning": "Browsing the article will allow us to analyze its contents and determine the appropriate next steps.",\n "plan": "- Browse the article\\n- Analyze its contents\\n- Determine the appropriate next steps",\n "criticism": "None",\n "speak": "I suggest we start by browsing the article at the given URL to analyze its contents and determine the appropriate next steps."\n },\n "command": {\n "name": "browse_website",\n "args": {\n "url": "https://v3u.cn/a_id_303",\n "question": "analyze the contents of the article"\n }\n }\n} \nResult: Command browse_website returned: Error: This model\'s maximum context length is 4097 tokens. However, you requested 4339 tokens (4039 in the messages, 300 in the completion). Please reduce the length of the messages or completion. 
\nHuman Feedback: GENERATE NEXT COMMAND JSON '] User: Determine which next command to use, and respond using the format specified above: Assistant: { "thoughts": { "text": "Let's start by browsing the article at https://v3u.cn/a_id_303 to get a better understanding of its contents.", "reasoning": "Browsing the article will allow us to analyze its contents and determine the appropriate next steps.", "plan": "- Browse the article\n- Analyze its contents\n- Determine the appropriate next steps", "criticism": "None", "speak": "I suggest we start by browsing the article at the given URL to analyze its contents and determine the appropriate next steps." "command": { "name": "browse_website", "args": { "url": "https://v3u.cn/a_id_303", "question": "analyze the contents of the article" User: Determine which next command to use, and respond using the format specified above: ----------- END OF CONTEXT ---------------- Creating chat completion with model gpt-3.5-turbo, temperature 0.0, max_tokens 2528最后将分析结果写入goal.txt文件:这篇文章主要阐释了苹果Mac电脑可以完成机器学习和深度学习任务,并且通过深度学习框架Tensorflow的安装和运行进行了佐证,同时也对Tensorflow的CPU和GPU的两种模型训练模式进行了深度对比和测试。一气呵成,流畅丝滑。结语AutoGPT和其他 AI 程序的不同之处在于,它专门专注于在无需人工干预的情况下生成提示和自动执行多步骤任务。它还具有扫描互联网或在用户计算机上执行命令以获取信息的能力,这使其有别于可能仅依赖于预先存在的数据集的其他人工智能程序。 AutoGPT的底层逻辑并不复杂:先通过搜索引擎检索任务,然后把结果和目标丢给gpt让它给出序列化方案json,再把方案分段丢给gpt,最后用shell去创建Python文件+json.load并且执行,是一个反复递归的过程。不能否认的是,虽然实现逻辑简单,但这无疑是一种“自我进化”的过程,相信随着时间的推移,AutoGPT可以更好地处理愈加复杂的任务。
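AutoGPT“目标→拆解→执行→结果回灌”的递归循环,可以用下面这个极简Python片段来示意。这里用一个写死计划的 fake_llm 函数模拟真实场景下 gpt-3.5-turbo 的决策,仅演示控制流本身,并非AutoGPT的实际实现:

```python
def fake_llm(goal: str, done: list) -> dict:
    """模拟 LLM:根据目标与已完成任务返回下一条指令(真实场景由 gpt-3.5-turbo 决定)。"""
    plan = ["browse_website", "summarize", "write_to_file"]
    for step in plan:
        if step not in done:
            return {"command": step}
    return {"command": "task_complete"}

def auto_loop(goal: str, executor, max_steps: int = 10) -> list:
    """AutoGPT 式任务循环:反复询问下一步指令并执行,直到任务完成或达到步数上限。"""
    done = []
    for _ in range(max_steps):
        decision = fake_llm(goal, done)
        cmd = decision["command"]
        if cmd == "task_complete":
            break
        executor(cmd)          # 真实实现中这里会去浏览网页、调用模型、写文件等
        done.append(cmd)       # 把执行结果回灌给下一轮决策
    return done

if __name__ == "__main__":
    log = []
    steps = auto_loop("总结 https://v3u.cn/a_id_303 并写入 goal.txt", log.append)
    print(steps)  # ['browse_website', 'summarize', 'write_to_file']
```

max_steps 上限对应AutoGPT里“最多拆解为五个任务”之类的保护措施,防止模型陷入无限递归。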

人工智能AI库Spleeter免费人声和背景音乐分离实践(Python3.10)

在视频剪辑工作中,假设我们拿到了一段电影或者电视剧素材,如果直接在剪辑的视频中播放可能会遭遇版权问题,大部分情况需要分离其中的人声和背景音乐,随后替换背景音乐进行二次创作,人工智能AI库Spleeter可以帮我们完成大部分素材的人声和背景音乐的分离流程。Spleeter的模型源来自流媒体音乐平台Deezer,底层基于深度学习框架Tensorflow,它可以通过模型识别出素材中的伴奏成分,从而判断出哪些是背景音乐,哪些是外部人声。

Spleeter安装

在终端运行pip命令:

pip3 install spleeter --user

安装成功之后,输入命令,检查Spleeter安装路径:

pip show spleeter

程序返回:

PS C:\Users\liuyue\www\videosite> pip show spleeter
WARNING: Ignoring invalid distribution -umpy (c:\python39\lib\site-packages)
Name: spleeter
Version: 2.3.2
Summary: The Deezer source separation library with pretrained models based on tensorflow.
Home-page: https://github.com/deezer/spleeter
Author: Deezer Research
Author-email: spleeter@deezer.com
License: MIT
Location: c:\users\liuyue\appdata\roaming\python\python39\site-packages
Requires: ffmpeg-python, httpx, librosa, llvmlite, norbert, numpy, pandas, protobuf, tensorflow, typer

说明安装成功。如果不想在本地搭建深度学习环境,也可以通过Docker镜像安装。关于Docker,请移步:一寸宕机一寸血,十万容器十万兵|Win10/Mac系统下基于Kubernetes(k8s)搭建Gunicorn+Flask高可用Web集群,这里不再赘述。运行Docker命令:

docker pull deezer/spleeter:3.8-5stems

这里程序加上预训练模型大概需要1.73GB的硬盘空间。

Spleeter分离人声和背景音乐

Spleeter同时支持视频和音频文件的人声和背景音乐分离,自带三种预训练模型:

1、人声&伴奏声分离模型(2 stems),分离出两个音轨;
2、鼓声、贝斯声及其它声分离模型(4 stems),分离出4个音轨;
3、鼓声、贝斯声、钢琴声及其它声分离模型(5 stems),分离出5个音轨。

后面两种模型相对比较精细,可以把人声、鼓声、贝斯声、钢琴声各自分离成多个音轨,一般适合音乐行业的专业人士使用。大多数情况下,我们只需要使用第一种模型 2 stems 即可,它将音频分离成人声和背景音乐两个音轨:

spleeter separate -o /output/ -p spleeter:2stems /test.mp3

这里-o代表输出目录,-p代表选择的分离模型,最后是要分离的素材。首次运行会比较慢,因为Spleeter会下载预训练模型,体积在1.73GB左右,运行完毕后,会在输出目录生成分离后的音轨文件:

accompaniment.wav
vocals.wav

vocals.wav是分离出的人声,accompaniment.wav则是背景音乐(伴奏)。如果是基于Docker安装的,则需要运行对应的Docker命令:

docker run -v $(pwd)/output:/output deezer/spleeter:3.8-5stems separate test.mp3 -o /output

结语

Spleeter可以算是免费的人声和背景音乐分离功能的最佳本地方案了,除了影视剧素材的人声和背景音乐分离的二次创作,如果是在外部环境录制的Vlog,环境音非常嘈杂,而又不得不现场录音,那么使用Spleeter也可以将人声从环境音中分离出来,节省了二次录制画外音的环节。

事实胜于雄辩,苹果MacOs能不能玩儿机器/深度(ml/dl)学习(Python3.10/Tensorflow2)

坊间有传MacOs系统不适合机器(ml)学习和深度(dl)学习,这是板上钉钉的刻板印象,就好像有人说女生不适合编程一样的离谱。现而今,无论是Pytorch框架的MPS模式,还是最新的Tensorflow2框架,都已经可以在M1/M2芯片的Mac系统中毫无桎梏地使用GPU显卡设备,本次我们来分享如何在苹果MacOS系统上安装和配置Tensorflow2框架(CPU/GPU)。

Tensorflow2深度学习环境安装和配置

首先并不需要任何虚拟环境,直接本地安装Python3.10即可,请参见:一网成擒全端涵盖,在不同架构(Intel x86/Apple m1 silicon)不同开发平台(Win10/Win11/Mac/Ubuntu)上安装配置Python3.10开发环境,这里不再赘述。随后安装Tensorflow本体:

pip3 install tensorflow-macos

这里系统会自动选择当前Python版本的Tensorflow安装包:

➜ ~ pip install tensorflow-macos
Collecting tensorflow-macos
Downloading tensorflow_macos-2.12.0-cp310-cp310-macosx_12_0_arm64.whl (200.8 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 200.8/200.8 MB 4.7 MB/s eta 0:00:00

安装包大小为200兆左右,如果下载不了,可以选择在pip官网直接下载基于python3.10的安装包:pypi.org/project/tensorflow-macos/#files ,然后直接将whl文件拖拽到终端安装即可。

接着安装Tensorflow的GPU插件:tensorflow-metal,它是一个TensorFlow的后端,使用苹果的Metal图形API来加速神经网络计算。Metal是一种高性能图形和计算API,专门为苹果设备的GPU设计,可以实现更快的神经网络计算。使用tensorflow-metal可以显著提高在苹果设备上运行TensorFlow的性能,尤其是在使用搭载M1和M2芯片的设备时。

pip3 install --user tensorflow-metal

注意这里安装命令必须带上--user参数,否则可能会报这个错误:

Non-OK-status: stream_executor::MultiPlatformManager::RegisterPlatform( std::move(cplatform)) status: INTERNAL: platform is already registered with name: "METAL"

安装好之后,在Python终端运行命令:

import tensorflow
tensorflow.config.list_physical_devices()

程序返回:

>>> import tensorflow
>>> tensorflow.config.list_physical_devices()
[PhysicalDevice(name='/physical_device:CPU:0', device_type='CPU'), PhysicalDevice(name='/physical_device:GPU:0', device_type='GPU')]

可以看到,Tensorflow用于计算的物理设备既支持CPU,也支持GPU,也就是显卡。接着,再编写一个完整的测试脚本 test.py:

import sys
import tensorflow.keras
import pandas as pd
import sklearn as sk
import scipy as sp
import tensorflow as tf
import platform

print(f"Python Platform: {platform.platform()}")
print(f"Tensor Flow Version: {tf.__version__}")
print(f"Keras Version: {tensorflow.keras.__version__}")
print()
print(f"Python {sys.version}")
print(f"Pandas {pd.__version__}")
print(f"Scikit-Learn {sk.__version__}")
print(f"SciPy {sp.__version__}")
gpu = len(tf.config.list_physical_devices('GPU')) > 0
print("GPU is", "available" if gpu else "NOT AVAILABLE")

这里打印出深度学习场景下常用的库和版本号:

➜ chatgpt_async git:(main) ✗ /opt/homebrew/bin/python3.10 "/Users/liuyue/wodfan/work/chatgpt_async/tensof_test.py"
Python Platform: macOS-13.3.1-arm64-arm-64bit
Tensor Flow Version: 2.12.0
Keras Version: 2.12.0
Python 3.10.9 (main, Dec 15 2022, 17:11:09) [Clang 14.0.0 (clang-1400.0.29.202)]
Pandas 1.5.2
Scikit-Learn 1.2.0
SciPy 1.10.0
GPU is available

一望而知,在最新的macOS-13.3.1系统中,基于Python3.10.9玩儿Tensorflow2.12没有任何问题。至此,Tensorflow2就配置好了。

Tensorflow框架GPU和CPU测试

为什么一定要让Tensorflow支持GPU?GPU或图形处理单元与CPU类似,同样具有许多核心,允许它们同时进行更快的计算(并行性)。这个特性非常适合执行大规模的数学计算,如计算图像矩阵、计算特征值、行列式等等。简而言之,GPU可以以并行方式运行代码并获得简明的结果,同时由于能够处理高强度的计算,因此可以比CPU更快地获得计算结果。

这里我们通过CIFAR-10项目进行测试。TensorFlow CIFAR-10项目是一个经典的计算机视觉项目,旨在训练一个模型,能够对CIFAR-10数据集中的图像进行分类。CIFAR-10数据集包含60,000张32x32像素的彩色图像,分为10个类别,每个类别包含6,000张图像。该项目的目标是训练一个深度神经网络模型,能够对这些图像进行准确的分类:

import tensorflow as tf
from tensorflow import keras
import numpy as np
import matplotlib.pyplot as plt

(X_train, y_train), (X_test, y_test) = keras.datasets.cifar10.load_data()

X_train_scaled = X_train/255
X_test_scaled = X_test/255

# one hot encoding labels
y_train_encoded = keras.utils.to_categorical(y_train, num_classes = 10, dtype = 'float32')
y_test_encoded = keras.utils.to_categorical(y_test, num_classes = 10, dtype = 'float32')

def get_model():
    model = keras.Sequential([
        keras.layers.Flatten(input_shape=(32,32,3)),
        keras.layers.Dense(3000, activation='relu'),
        keras.layers.Dense(1000, activation='relu'),
        keras.layers.Dense(10, activation='sigmoid')
    ])
    model.compile(optimizer='SGD',
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])
    return model

首先测试CPU性能:

%%timeit -n1 -r1
# CPU
with tf.device('/CPU:0'):
    model_cpu = get_model()
    model_cpu.fit(X_train_scaled, y_train_encoded, epochs = 10)

这段代码使用了%%timeit -n1
-r1魔术命令来测试在CPU上训练模型的时间。-n1表示只运行一次,-r1表示只运行一轮,如果没有指定这些参数,则会运行多次并计算平均值。/CPU:0指的是第一个CPU(如果计算机只有一个CPU,则是唯一的CPU)。这里使用get_model()函数获取模型,使用model_cpu.fit()方法在CPU上训练模型,使用X_train_scaled和y_train_encoded作为输入数据,并在10个epoch内进行训练。最后,使用%%timeit命令来测试训练模型所需的时间,以便比较不同设备的性能。程序返回:

50000/50000 [==========================] - 80s 2ms/sample
14min 9s

需要14分钟。接着测试GPU性能:

%%timeit -n1 -r1
# GPU
with tf.device('/GPU:0'):
    model_gpu = get_model()
    model_gpu.fit(X_train_scaled, y_train_encoded, epochs = 10)

程序返回:

50000/50000 [==========================] - 11s 227us/sample
1min 55s

一分多钟,很明显在GPU上训练模型比在CPU上训练模型更快,因为GPU可以同时处理多个任务。

结语

苹果MacOs系统可以承担深度学习任务,但术业有专攻,算力层面还是比不上配置N卡的其他平台,这是不争的事实。没错,更好的选择是RTX3090,甚至是4090,但一块RTX4090显卡的价格是1500刀左右,这还意味着CPU、内存、主板和电源都得单买,而一台m2芯片的MacBook Air的价格是多少呢?
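上面CPU耗时14分9秒、GPU耗时1分55秒的对比,可以顺手折算成加速比,量化GPU带来的收益:

```python
def to_seconds(minutes: int, seconds: int) -> int:
    """把 分+秒 折算成总秒数。"""
    return minutes * 60 + seconds

cpu = to_seconds(14, 9)    # CPU 训练耗时:849 秒
gpu = to_seconds(1, 55)    # GPU 训练耗时:115 秒
speedup = cpu / gpu

print(f"GPU 加速比约为 {speedup:.1f} 倍")  # GPU 加速比约为 7.4 倍
```

也就是说,在这组10个epoch的CIFAR-10训练里,Metal后端的GPU比CPU快了7倍以上。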

成为钢铁侠!只需一块RTX3090,微软开源贾维斯(J.A.R.V.I.S.)人工智能AI助理系统

梦想照进现实,微软果然不愧是微软,开源了贾维斯(J.A.R.V.I.S.)人工智能助理系统,贾维斯(Jarvis)全称为Just A Rather Very Intelligent System(只是一个相当聪明的人工智能系统),它可以帮助钢铁侠托尼·斯塔克完成各种任务和挑战,包括控制和管理托尼的机甲装备,提供实时情报和数据分析,帮助托尼做出决策等等。如今,我们也可以拥有自己的贾维斯人工智能助理,成本仅仅是一块RTX3090显卡。

贾维斯(Jarvis)的环境配置

一般情况下,深度学习领域相对主流的入门级别显卡是2070或者3070,而3090可以算是消费级深度学习显卡的天花板了。再往上走就是工业级别的A系列和V系列显卡,显存是一个硬指标,因为需要加载本地的大模型,虽然可以改代码对模型加载进行“阉割”,但功能上肯定也会有一定的损失。如果没有3090,也可以组两块3060 12G的并行,显存虽然可以达标,但算力和综合性能抵不过3090。

确保本地具备足以支撑贾维斯(Jarvis)的硬件环境之后,老规矩,克隆项目:

git clone https://github.com/microsoft/JARVIS.git

随后进入项目目录:

cd JARVIS

修改项目的配置文件 server/config.yaml:

openai:
  key: your_personal_key # gradio, your_personal_key
huggingface:
  cookie: # required for huggingface inference
local: # ignore: just for development
  endpoint: http://localhost:8003
dev: false
debug: false
log_file: logs/debug.log
model: text-davinci-003 # text-davinci-003
use_completion: true
inference_mode: hybrid # local, huggingface or hybrid
local_deployment: minimal # no, minimal, standard or full
num_candidate_models: 5
max_description_length: 100
proxy:
httpserver:
  host: localhost
  port: 8004
modelserver:
  host: localhost
  port: 8005
logit_bias:
  parse_task: 0.1
  choose_model: 5

这里主要修改三个配置即可,分别是openai的key、huggingface官网的cookie令牌,以及OpenAI的model,默认使用的模型是text-davinci-003。修改完成后,官方推荐使用虚拟环境conda,Python版本3.8,私以为这里完全没有任何必要使用虚拟环境,直接上Python3.10即可,接着安装依赖:

pip3 install -r requirements.txt

项目依赖库如下:

git+https://github.com/huggingface/diffusers.git@8c530fc2f6a76a2aefb6b285dce6df1675092ac6#egg=diffusers
git+https://github.com/huggingface/transformers@c612628045822f909020f7eb6784c79700813eda#egg=transformers
git+https://github.com/patrickvonplaten/controlnet_aux@78efc716868a7f5669c288233d65b471f542ce40#egg=controlnet_aux
tiktoken==0.3.3
pydub==0.25.1
espnet==202301
espnet_model_zoo==0.1.7
flask==2.2.3
flask_cors==3.0.10
waitress==2.1.2
datasets==2.11.0
asteroid==0.6.0
speechbrain==0.5.14
timm==0.6.13
typeguard==2.13.3
accelerate==0.18.0
pytesseract==0.3.10
gradio==3.24.1

这里web端接口是用Flask2.2高版本搭建的,但奇怪的是微软并未使用Flask新版本的异步特性。
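启动服务之前,可以先用一个小脚本检查config.yaml中必须修改的三处配置是否已经填好。下面的校验函数针对字典结构编写,键名按正文的配置示例假设,实际使用时可先用 yaml.safe_load 把配置文件读成字典再传入:

```python
def check_config(cfg: dict) -> list:
    """返回缺失的必填配置项列表:openai.key、huggingface.cookie、model。"""
    missing = []
    if not cfg.get("openai", {}).get("key"):
        missing.append("openai.key")
    if not cfg.get("huggingface", {}).get("cookie"):
        missing.append("huggingface.cookie")
    if not cfg.get("model"):
        missing.append("model")
    return missing

if __name__ == "__main__":
    # 模拟一份只填了 key 和 model 的配置
    cfg = {"openai": {"key": "sk-xxx"}, "huggingface": {"cookie": ""},
           "model": "text-davinci-003"}
    print(check_config(cfg))  # ['huggingface.cookie']
    # 实际使用:cfg = yaml.safe_load(open("server/config.yaml", encoding="utf-8"))
```

把这段检查放在启动脚本最前面,可以避免服务跑到一半才因为缺少cookie或密钥而报错。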
安装完成之后,进入模型目录:

cd models

下载模型和数据集:

sh download.sh

这里一定要做好心理准备,因为模型就已经占用海量的硬盘空间了,数据集更是不必多说,所有文件均来自huggingface:

models="
nlpconnect/vit-gpt2-image-captioning
lllyasviel/ControlNet
runwayml/stable-diffusion-v1-5
CompVis/stable-diffusion-v1-4
stabilityai/stable-diffusion-2-1
Salesforce/blip-image-captioning-large
damo-vilab/text-to-video-ms-1.7b
microsoft/speecht5_asr
facebook/maskformer-swin-large-ade
microsoft/biogpt
facebook/esm2_t12_35M_UR50D
microsoft/trocr-base-printed
microsoft/trocr-base-handwritten
JorisCos/DCCRNet_Libri1Mix_enhsingle_16k
espnet/kan-bayashi_ljspeech_vits
facebook/detr-resnet-101
microsoft/speecht5_tts
microsoft/speecht5_hifigan
microsoft/speecht5_vc
facebook/timesformer-base-finetuned-k400
runwayml/stable-diffusion-v1-5
superb/wav2vec2-base-superb-ks
openai/whisper-base
Intel/dpt-large
microsoft/beit-base-patch16-224-pt22k-ft22k
facebook/detr-resnet-50-panoptic
facebook/detr-resnet-50
openai/clip-vit-large-patch14
google/owlvit-base-patch32
microsoft/DialoGPT-medium
bert-base-uncased
Jean-Baptiste/camembert-ner
deepset/roberta-base-squad2
facebook/bart-large-cnn
google/tapas-base-finetuned-wtq
distilbert-base-uncased-finetuned-sst-2-english
mrm8488/t5-base-finetuned-question-generation-ap
Jean-Baptiste/camembert-ner
t5-base
impira/layoutlm-document-qa
ydshieh/vit-gpt2-coco-en
dandelin/vilt-b32-finetuned-vqa
lambdalabs/sd-image-variations-diffusers
facebook/timesformer-base-finetuned-k400
facebook/maskformer-swin-base-coco
Intel/dpt-hybrid-midas
lllyasviel/sd-controlnet-canny
lllyasviel/sd-controlnet-depth
lllyasviel/sd-controlnet-hed
lllyasviel/sd-controlnet-mlsd
lllyasviel/sd-controlnet-openpose
lllyasviel/sd-controlnet-scribble
lllyasviel/sd-controlnet-seg
"

# CURRENT_DIR=$(cd `dirname $0`; pwd)
CURRENT_DIR=$(pwd)

for model in $models;
do
    echo "----- Downloading from https://huggingface.co/"$model" -----"
    if [ -d "$model" ]; then
        # cd $model && git reset --hard && git pull && git lfs pull
        cd $model && git pull && git lfs pull
        cd $CURRENT_DIR
    else
        # git clone 包含了lfs
        git clone https://huggingface.co/$model $model
    fi
done

datasets="Matthijs/cmu-arctic-xvectors"

for dataset in $datasets;
do
    echo "----- Downloading from https://huggingface.co/datasets/"$dataset" -----"
    if [ -d "$dataset" ]; then
        cd $dataset && git pull && git lfs pull
        cd $CURRENT_DIR
    else
        git clone https://huggingface.co/datasets/$dataset $dataset
    fi
done

也可以考虑拆成两个shell,开多进程下载,速度会快很多。但事实上,真的,别下了,文件属实过于巨大,这玩意儿真的不是普通人能耍起来的,当然选择不下载本地模型和数据集也能运行,请看下文。漫长的下载流程结束之后,贾维斯(Jarvis)就配置好了。

运行贾维斯(Jarvis)

如果您选择下载了所有的模型和数据集(佩服您是条汉子),终端内启动服务:

python models_server.py --config config.yaml

随后会在系统的8004端口启动一个Flask服务进程,然后发起Http请求即可运行贾维斯(Jarvis):

curl --location 'http://localhost:8004/hugginggpt' \
--header 'Content-Type: application/json' \
--data '{
    "messages": [
        {
            "role": "user",
            "content": "please generate a video based on \"Spiderman is surfing\""
        }
    ]
}'

这个的意思是让贾维斯(Jarvis)生成一段“蜘蛛侠在冲浪”的视频。当然了,以笔者的硬件环境,是不可能跑起来的,所以可以对加载的模型适当“阉割”,在models_server.py文件的81行左右:

other_pipes = { "nlpconnect/vit-gpt2-image-captioning":{ "model": VisionEncoderDecoderModel.from_pretrained(f"{local_fold}/nlpconnect/vit-gpt2-image-captioning"), "feature_extractor": ViTImageProcessor.from_pretrained(f"{local_fold}/nlpconnect/vit-gpt2-image-captioning"), "tokenizer": AutoTokenizer.from_pretrained(f"{local_fold}/nlpconnect/vit-gpt2-image-captioning"), "device": "cuda:0" "Salesforce/blip-image-captioning-large": { "model": BlipForConditionalGeneration.from_pretrained(f"{local_fold}/Salesforce/blip-image-captioning-large"), "processor": BlipProcessor.from_pretrained(f"{local_fold}/Salesforce/blip-image-captioning-large"), "device": "cuda:0" "damo-vilab/text-to-video-ms-1.7b": { "model": DiffusionPipeline.from_pretrained(f"{local_fold}/damo-vilab/text-to-video-ms-1.7b", torch_dtype=torch.float16, variant="fp16"), "device": "cuda:0" "facebook/maskformer-swin-large-ade": { "model": MaskFormerForInstanceSegmentation.from_pretrained(f"{local_fold}/facebook/maskformer-swin-large-ade"), "feature_extractor" :
AutoFeatureExtractor.from_pretrained("facebook/maskformer-swin-large-ade"), "device": "cuda:0" "microsoft/trocr-base-printed": { "processor": TrOCRProcessor.from_pretrained(f"{local_fold}/microsoft/trocr-base-printed"), "model": VisionEncoderDecoderModel.from_pretrained(f"{local_fold}/microsoft/trocr-base-printed"), "device": "cuda:0" "microsoft/trocr-base-handwritten": { "processor": TrOCRProcessor.from_pretrained(f"{local_fold}/microsoft/trocr-base-handwritten"), "model": VisionEncoderDecoderModel.from_pretrained(f"{local_fold}/microsoft/trocr-base-handwritten"), "device": "cuda:0" "JorisCos/DCCRNet_Libri1Mix_enhsingle_16k": { "model": BaseModel.from_pretrained("JorisCos/DCCRNet_Libri1Mix_enhsingle_16k"), "device": "cuda:0" "espnet/kan-bayashi_ljspeech_vits": { "model": Text2Speech.from_pretrained(f"espnet/kan-bayashi_ljspeech_vits"), "device": "cuda:0" "lambdalabs/sd-image-variations-diffusers": { "model": DiffusionPipeline.from_pretrained(f"{local_fold}/lambdalabs/sd-image-variations-diffusers"), #torch_dtype=torch.float16 "device": "cuda:0" "CompVis/stable-diffusion-v1-4": { "model": DiffusionPipeline.from_pretrained(f"{local_fold}/CompVis/stable-diffusion-v1-4"), "device": "cuda:0" "stabilityai/stable-diffusion-2-1": { "model": DiffusionPipeline.from_pretrained(f"{local_fold}/stabilityai/stable-diffusion-2-1"), "device": "cuda:0" "runwayml/stable-diffusion-v1-5": { "model": DiffusionPipeline.from_pretrained(f"{local_fold}/runwayml/stable-diffusion-v1-5"), "device": "cuda:0" "microsoft/speecht5_tts":{ "processor": SpeechT5Processor.from_pretrained(f"{local_fold}/microsoft/speecht5_tts"), "model": SpeechT5ForTextToSpeech.from_pretrained(f"{local_fold}/microsoft/speecht5_tts"), "vocoder": SpeechT5HifiGan.from_pretrained(f"{local_fold}/microsoft/speecht5_hifigan"), "embeddings_dataset": load_dataset(f"{local_fold}/Matthijs/cmu-arctic-xvectors", split="validation"), "device": "cuda:0" "speechbrain/mtl-mimic-voicebank": { "model": 
WaveformEnhancement.from_hparams(source="speechbrain/mtl-mimic-voicebank", savedir="models/mtl-mimic-voicebank"), "device": "cuda:0" "microsoft/speecht5_vc":{ "processor": SpeechT5Processor.from_pretrained(f"{local_fold}/microsoft/speecht5_vc"), "model": SpeechT5ForSpeechToSpeech.from_pretrained(f"{local_fold}/microsoft/speecht5_vc"), "vocoder": SpeechT5HifiGan.from_pretrained(f"{local_fold}/microsoft/speecht5_hifigan"), "embeddings_dataset": load_dataset(f"{local_fold}/Matthijs/cmu-arctic-xvectors", split="validation"), "device": "cuda:0" "julien-c/wine-quality": { "model": joblib.load(cached_download(hf_hub_url("julien-c/wine-quality", "sklearn_model.joblib"))) "facebook/timesformer-base-finetuned-k400": { "processor": AutoImageProcessor.from_pretrained(f"{local_fold}/facebook/timesformer-base-finetuned-k400"), "model": TimesformerForVideoClassification.from_pretrained(f"{local_fold}/facebook/timesformer-base-finetuned-k400"), "device": "cuda:0" "facebook/maskformer-swin-base-coco": { "feature_extractor": MaskFormerFeatureExtractor.from_pretrained(f"{local_fold}/facebook/maskformer-swin-base-coco"), "model": MaskFormerForInstanceSegmentation.from_pretrained(f"{local_fold}/facebook/maskformer-swin-base-coco"), "device": "cuda:0" "Intel/dpt-hybrid-midas": { "model": DPTForDepthEstimation.from_pretrained(f"{local_fold}/Intel/dpt-hybrid-midas", low_cpu_mem_usage=True), "feature_extractor": DPTFeatureExtractor.from_pretrained(f"{local_fold}/Intel/dpt-hybrid-midas"), "device": "cuda:0" }把不需要的模型屏蔽即可。 与此同时,还可以不通过本地服务接口,修改config.yaml的inference\_mode: huggingface随后直接在终端运行ChatGPT版本的贾维斯:cd server python3 awesome_chat.py --config config.yaml --mode cli运行效果:Welcome to Jarvis! A collaborative system that consists of an LLM as the controller and numerous expert models as collaborative executors. Jarvis can plan tasks, schedule Hugging Face models, generate friendly responses based on your requests, and help you with many things. Please enter your request (`exit` to exit). 
[ User ]: Given a collection of image A: /examples/cat.jpg, B: /examples/z3.jpg, C: /examples/z1.jpg, please tell me how many zebras in these picture? [ Jarvis ]: Based on your request, I have used several models to process the images you provided. For the image A: /examples/cat.jpg, I used the image-to-text model nlpconnect/vit-gpt2-image-captioning to generate the text "a cat sitting on a window sill looking out". Then I used the object-detection model facebook/detr-resnet-50 to generate the image with predicted box '/images/a3dc.jpg', which contains two objects with labels 'potted plant' and 'cat'. For the image B: /examples/z3.jpg, I used the image-to-text model nlpconnect/vit-gpt2-image-captioning to generate the text "two zebras standing in a field of dry grass". Then I used the object-detection model facebook/detr-resnet-50 to generate the image with predicted box '/images/f5df.jpg', which contains three objects with labels 'zebra'. For the image C: /examples/z1.jpg, I used the image-to-text model nlpconnect/vit-gpt2-image-captioning to generate the text "a zebra standing in a field of tall grass". Then I used the object-detection model facebook/detr-resnet-50 to generate the image with predicted box '/images/69c7.jpg', which contains one object with label 'zebra'. Finally, I used the canny-control model canny-control to generate the image '/images/7e18.png' based on the image /examples/savanna.jpg. Therefore, there are four zebras in these pictures. Is there anything else I can help you with? [ User ]: Please answer all the named entities in the sentence: Iron Man is a superhero appearing in American comic books published by Marvel Comics. The character was co-created by writer and editor Stan Lee, developed by scripter Larry Lieber, and designed by artists Don Heck and Jack Kirby. [ Jarvis ]: Sure, I understand your request. Based on the inference results of the models, I will answer your request step by step. 
Firstly, I used the token-classification model dslim/bert-base-NER to recognize the named entities in the sentence. The inference result is that the named entities are Iron Man, Stan Lee, Larry Lieber, Don Heck and Jack Kirby. Then, I used the question-answering model deepset/roberta-base-squad2 to answer your request. The inference result is that there is no output since the context cannot be empty. Therefore, I cannot make it. I hope this answer your request. Is there anything else I can help you with? [ User ]: Please generate a canny image based on /examples/savanna.jpg [ Jarvis ]: Sure. I have generated a canny image based on /examples/savanna.jpg. To do this, I first used the image-to-text model nlpconnect/vit-gpt2-image-captioning to generate the text description of the image, which is "a herd of giraffes and zebras grazing in a field". Then I used the canny-control model to generate the canny image of the original image, which is stored in the path "/images/1ea2.png". Finally, I used the canny-text-to-image model lllyasviel/sd-controlnet-canny to generate the canny image based on the text description and the canny image, which is stored in the path "/images/ba66.png". Do you need anything else?可以理解为ChatGPT版本的文字生成图片,外面套了个贾维斯(Jarvis)的壳儿,演出效果满分。结语总的来说,和微软之前开源的“可视化图形ChatGPT”一样,此类项目的象征意义要远远大于现实意义。贾维斯(Jarvis)代表的是大多数技术同仁的共同愿景,对于这类人工智能技术的发展,可以肯定,但由于硬件门槛过高的原因,短期内还不能过于期待。
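正文中的curl请求也可以用Python改写,下面的片段负责拼装 /hugginggpt 接口的请求体(8004端口与字段结构沿用正文示例),实际发送时再引入requests:

```python
import json

def build_jarvis_request(prompt: str) -> dict:
    """构造 JARVIS /hugginggpt 接口的请求体,与正文 curl 示例的 JSON 结构一致。"""
    return {"messages": [{"role": "user", "content": prompt}]}

def ask_jarvis(prompt: str, host: str = "http://localhost:8004") -> dict:
    """向本地 models_server 发起请求;需要先 pip 安装 requests。"""
    import requests
    resp = requests.post(f"{host}/hugginggpt",
                         headers={"Content-Type": "application/json"},
                         json=build_jarvis_request(prompt))
    return resp.json()

if __name__ == "__main__":
    body = build_jarvis_request('please generate a video based on "Spiderman is surfing"')
    print(json.dumps(body, ensure_ascii=False))
```

把 ask_jarvis 包在循环里,就可以像正文CLI模式那样做连续多轮的问答。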

读破万卷,神交古人,突破ChatGPT4096的Token限制,llama_index建立自己的垂直领域资料人工智能助理

ChatGPT的泛用性极高,上知天文,下通地理,参考古今,博稽中外,几乎无所不知,无所不晓。但如果涉及垂直领域的专业知识点,ChatGPT难免也会有语焉不详、闪烁其词的毛病,本次我们将特定领域的学习材料“喂”给ChatGPT,让它“学习”后再来回答专业问题。

专业领域语料问题

所谓专业领域语料问题,可以理解为特定范围内的知识图谱,也就是给GPT提供前置的检索维度。举个例子,大家都读过鲁迅的名篇《从百草园到三味书屋》,文章中涉及一个“美女蛇”的典故,假设我们没有给GPT设置一个特定范围,直接问“美女蛇”的相关问题:

一望而知,ChatGPT对于“美女蛇”典故的理解出现了信息偏差,它以为“美女蛇”指的是《白蛇传》中白素贞、许仙以及法海的故事。但其实我们都知道,《从百草园到三味书屋》中的“美女蛇”指的是人首蛇身的怪物,能唤人名,倘一答应,夜间便要来吃这人的肉。

所以,如果我们想谈论“美女蛇”相关的问题,必须让ChatGPT有一个特定的“语境”,它才能理解真正要谈论的话题,这就需要把《从百草园到三味书屋》作为语料“喂”给ChatGPT才可以。当然,《从百草园到三味书屋》作为人尽皆知的散文,它肯定默认存储于ChatGPT的语料库中,但假设某一个领域的论文或者其他资料并未出现在ChatGPT的语料库中,而该文章的长度又超过ChatGPT输入的4096个token的限制,那么就非常麻烦了,所以让ChatGPT具备学习“新材料”的能力就显得十分必要。

llama_index配置语料索引

LlamaIndex(GPT Index)是一个针对特定语料检索的GPT项目,可以通过索引文件把外部语料数据和GPT连接起来。首先安装项目:

pip3 install llama-index

注意该项目依赖langchain模块,为了确保不出问题,最好升级一下langchain模块:

pip3 install --upgrade langchain

LlamaIndex所做的是将我们的原始语料数据转换成一个基于向量的索引,这对检索来说是非常高效的。它将使用这个索引,根据查询和数据的相似性,找到最相关的部分,然后把检索到的内容插入到发送给GPT的引导词(prompt)中,这样GPT就有了回答问题的“语境”。

具体工作流:

1、将本地答案数据集转为向量,存储到向量数据库(index.json)中;
2、当用户输入查询的问题时,把问题转为向量,然后从向量数据库中查询相近的答案topK。

这个时候其实就是我们最普遍的问答查询方案,在没有GPT的时候就直接返回相关的答案,整个流程就结束了。基于GPT可以优化回答内容的整体结构,在单纯的搜索场景下这个优化没什么意义,但如果在垂直领域特定的聊天场景下,引用相关领域内容回复时,数据检索会更加精准。

首先把《从百草园到三味书屋》这篇文章写入到项目的data目录中,随后编写代码:

import os
from llama_index import SimpleDirectoryReader, GPTSimpleVectorIndex, LLMPredictor, ServiceContext
from langchain import OpenAI

os.environ["OPENAI_API_KEY"] = 'apikey'

class LLma:

    # 建立本地索引
    def create_index(self, dir_path="./data"):
        # 读取data文件夹下的文档
        documents = SimpleDirectoryReader(dir_path).load_data()
        index = GPTSimpleVectorIndex.from_documents(documents)
        print(documents)
        # 保存索引
        index.save_to_disk('./index.json')

这里通过GPTSimpleVectorIndex.from_documents方法读取data目录中的语料文章,随后转换为向量索引存储在本地磁盘的index.json文件中。执行建立索引方法:

if __name__ == '__main__':
    llma = LLma()
    # 建立索引
    llma.create_index()

索引的内容:

{"index_struct": {"__type__": "simple_dict", "__data__": {"index_id": "86c83b5a-a975-43ab-8505-cbc8f0ae68e2", "summary": null, "nodes_dict": {"da552579-e0f4-4ee0-be68-a3c392e39dc2":
"a2521cfa-13c5-49b2-9cfd-7206fe493666", "c1f7df04-5e6c-4327-a0cc-4a3489d50d19": "68b609e3-2ec5-4de2-ac43-eb28105364ca"}, "doc_id_dict": {"87411099-60d8-4272-a7d1-6e8676fc42a0": ["da552579-e0f4-4ee0-be68-a3c392e39dc2", "c1f7df04-5e6c-4327-a0cc-4a3489d50d19"]}, "embeddings_dict": {"da552579-e0f4-4ee0-be68-a3c392e39dc2": [0.004821529611945152, -0.005787167698144913, 0.00886388961225748, -0.0005273548304103315, -0.0007779211737215519, 0.022242968901991844, -0.0035828494001179934, -0.023534925654530525, -0.03012790158390999, -0.014744291082024574, 0.004718306474387646, -0.0010788505896925926, -0.006236688699573278, 0.0033247910905629396, 0.01692862994968891, 0.02300216071307659, 0.01628931239247322, 0.008157975040376186, 0.028822625055909157, -0.0011337919859215617, 0.006499741692095995, 0.02746407315135002, -0.016302630305290222, -0.002881929511204362, -0.011933951638638973, 0.016502417623996735, 0.03031436912715435, -0.016489099711179733, 0.003935806918889284, -0.0009106963989324868, 0.0039058385882526636, 0.004168891813606024, -0.018566885963082314, -0.00980954896658659, -0.026451818645000458, -0.027490710839629173, -0.008237889967858791, -0.005337646696716547, 0.010009336285293102, 0.0037393493112176657, 0.013931823894381523, 0.0008798958733677864, -0.004105625674128532, 0.011208058334887028, -0.01188733521848917, 0.008311145007610321, -0.020058630034327507, -0.006176752503961325, -0.01582314260303974, 0.026438498869538307, 0.005500806029886007, 0.005507465451955795, -0.028103390708565712, -0.01434471644461155, -0.010175825096666813, 0.011747484095394611, 0.01688867248594761, 0.026558371260762215, -0.010755208320915699, -0.011754143983125687, 0.014970717020332813, 0.017115099355578423, -0.02107088454067707, -0.015317014418542385, -0.020990969613194466, 0.028103390708565712, 0.007438741158694029, -0.040436916053295135, -0.0037460089661180973, -0.0131593132391572, 0.032285600900650024, 0.030554113909602165, 0.005933678243309259, -0.015090588480234146, 
0.041422534734010696, -0.01100827194750309, -0.03617479279637337, 0.010488824918866158, -0.010701931081712246, 0.009243485517799854, -0.005277710501104593, -0.01450454629957676, -0.02233620174229145, 0.02344169095158577, 0.01539692934602499, 0.019046373665332794, -0.019206203520298004, 0.04653708636760712, -0.015490163117647171, -0.023175308480858803, 0.012573271058499813, 0.026398541405797005, 0.013179291971027851, 9.521106403553858e-05, 0.018100714311003685, 0.020857777446508408, -0.007119081914424896, 0.013279185630381107, -0.0021826745942234993, -0.0350826233625412, -0.0061501143500208855, -0.009136931970715523, -0.009862825274467468, 0.002357488265261054, -0.023561563342809677, 0.008231230080127716, 0.02282901108264923, 0.00845099613070488, 0.03207249566912651, -0.01539692934602499, -0.007352166809141636, 0.03236551582813263, 0.008677421137690544, -0.04581785202026367, -0.0017514672363176942, -0.026385221630334854, 0.027863647788763046, 0.008018123917281628, -0.016955269500613213, -0.0055873803794384, -0.004668359644711018, 0.01126133557409048, -0.004535167943686247, -0.026385221630334854, -0.0008724038489162922, 0.020697947591543198, -0.011780781671404839, 0.01530369557440281, -0.02426747791469097, -0.013505610637366772, 0.010715250857174397, 0.028662795200943947, 0.017354842275381088, -0.006056880112737417, -0.04019717499613762, 0.0062999543733894825, -0.017195014283061028, -0.003965775016695261, 0.0009897787822410464, -0.02342837303876877, 0.005976965185254812, 0.016822077333927155, -0.012699802406132221, -0.021030927076935768, 0.0061334650963544846, 0.031992580741643906, -9.115288412431255e-05, 0.007465379778295755, -0.011481101624667645, -0.0066828797571361065, 0.02358820289373398, -0.0002526475000195205, 0.03460313379764557, -0.005790497176349163, 0.01036229357123375, 0.013139334507286549, -0.0052244337275624275, 0.011660909280180931, -0.019033055752515793, 0.011667569167912006, 0.024027733132243156, 0.02807675302028656, 0.021803436800837517, 
-0.011048229411244392, 0.002240945817902684, 0.024760287255048752, 0.004934742581099272, -0.004338710568845272, -0.0006101832259446383, -0.015023993328213692, -0.011694207787513733, 0.013119355775415897, -0.021590329706668854, 0.028263220563530922, 0.003759328043088317, 0.007625209167599678, 0.012107101269066334, 0.015849780291318893, -0.019659055396914482, -0.012972844764590263, -0.04509861767292023, -0.022908926010131836, 0.01881994865834713, 0.01108152698725462, -0.019073013216257095, -0.020458202809095383, -0.012686483561992645, 0.0038159345276653767, -0.0018863235600292683, -0.0006988387904129922, 0.00893048569560051, 0.02617211639881134, -0.019539183005690575, -0.014384673908352852, -0.5932878851890564, -0.011580994352698326, -0.01566331274807453, 0.0027354189660400152, 0.008977102115750313, 0.01149442046880722, 0.006489752326160669, 0.01800748147070408, -0.004032370634377003, -0.04208848997950554, -0.023028798401355743, 0.015050631016492844, 0.005440869834274054, 0.0030251103453338146, -0.01690199226140976, -0.007392124272882938, 0.009263464249670506, -0.011094845831394196, -0.002357488265261054, -0.01977892778813839, -0.008197932504117489, 0.015503481961786747, 0.004951391369104385, 0.007432081736624241, -0.013239228166639805, 0.020951012149453163, 0.012413441203534603, -0.018766671419143677, -0.004964710678905249, 0.034683048725128174, -0.031140156090259552, 0.016196077689528465, 0.013798631727695465, 0.00372935994528234, 0.05791163444519043, -0.005960316397249699, -0.011427824385464191, 0.01466437615454197, 0.027970200404524803, 0.0183271411806345, 0.011274654418230057, -0.00394579628482461, 0.012460058555006981, -0.007025847677141428, -0.0052310931496322155, -0.013918504118919373, 0.025785861536860466, -0.007798358332365751, 0.0028436370193958282, 0.02583913691341877, 0.0004337045829743147, 0.0019362703897058964, -0.00045826175482943654, -0.010415569879114628, 0.023015478625893593, -0.0012678159400820732, 0.018580203875899315, -0.04086313024163246, 
-0.0015425232704728842, 0.0047849020920693874, 0.016795439645648003, 0.026744838804006577, -0.004138923715800047, -0.010189143940806389, -0.005763859022408724, 0.02454718016088009, -0.04099632054567337, -0.00870405975729227, 0.008690740913152695, -0.0016657252563163638, 0.0038692110683768988, 0.011647590436041355, -0.006389858666807413, 0.0014010072918608785, 0.002547285985201597, 0.022775733843445778, 0.02202986180782318, 0.007245613727718592, -0.007991485297679901, 0.01403837651014328, 0.01580982282757759, -0.003476296318694949, -0.023201946169137955, -0.01768782176077366, 0.010355633683502674, -0.003989083226770163, -0.011387866921722889, 0.0207379050552845, -0.0004757431452162564, -0.009436612948775291, -0.01507726963609457, 0.012506674975156784, -0.004158902447670698, 0.006250008009374142, 0.002142717130482197, 0.008644123561680317, -0.022229649126529694, -0.012100441381335258, 0.009316740557551384, -0.033883899450302124, 0.007305549923330545, -0.04517853260040283, -0.006925954483449459, 0.01164093054831028, -0.0032665198668837547, 0.024853520095348358, -0.014131610281765461, -0.010302357375621796, 0.018233906477689743, -0.0215237345546484, 0.0005473335040733218, 0.0028070092666894197, 0.020697947591543198, -0.0022043180651962757, 0.005677284672856331, -0.019366033375263214, 0.021150799468159676, 0.021643606945872307, 0.03929147124290466, 0.00853757094591856, 0.013132674619555473, 0.02967504970729351, 0.006872677709907293, -0.0004037365142721683, 0.038359131664037704, 0.009429953061044216, -0.02007194794714451, -0.005027976352721453, 0.014211525209248066, 0.00625666743144393, 0.0001508809218648821, -0.014904120936989784, 0.036680918186903, -0.017141737043857574, 0.03657436743378639, -0.017101779580116272, 0.01579650305211544, -0.004821529611945152, 0.02362816035747528, -0.009389995597302914, -0.007771719712764025, 0.016182757914066315, 0.02091105468571186, -0.004601764027029276, -0.009449931792914867, -0.017621226608753204, -0.010542101226747036, 
-0.014597781002521515, -0.036601003259420395, 0.0007879105396568775, 0.0012811350170522928, -0.007978166453540325, -0.013219249434769154, -0.010974973440170288, -0.01932607591152191, 0.041076235473155975, -0.013618824072182178, -0.0075852517038583755, -0.025439562276005745, 0.01736816205084324, 0.01325254701077938, 0.0026987912133336067, -0.01690199226140976, -0.004954721312969923, 0.004082317464053631, -0.027157733216881752, -0.010661973617970943, -0.005624007899314165, -0.03300483524799347, -0.03396381437778473, -0.010435548610985279, -0.014011737890541553, 0.0018430363852530718, 0.00505794445052743, -0.005011327564716339, -0.015383610501885414, -0.02697126381099224, -0.000325278437230736, -0.019419310614466667, -0.003979093860834837, 0.014211525209248066, 0.0023175308015197515, -0.003479626029729843, -0.0031166793778538704, 0.007725102826952934, -0.02057807520031929, 0.011181420646607876, 0.02759726345539093, 0.01992543786764145, 0.021124159917235374, 0.01269314344972372, 0.02280237339437008, -0.013705397956073284, 0.01612948253750801, -0.012706462293863297, -0.00011789522250182927, 0.030074624344706535, 0.013439015485346317, -0.008457656018435955, 0.03785300254821777, 0.003639455884695053, 0.02488015964627266, 0.0007954025641083717, -0.021590329706668854, 0.011967250145971775, -0.023881223052740097, 0.014850844629108906, -0.016848715022206306, 0.012420101091265678, -0.005164497531950474, -0.001505063148215413, -0.005430880468338728, -0.012346845120191574, -0.02011190541088581, 0.0077783796004951, 0.017168374732136726, -0.012619887478649616, 0.00326818460598588, 0.011953930370509624, -0.002958514727652073, -0.013998419046401978, -0.0045451573096215725, 0.010182484984397888, -0.0003080051683355123, -0.01852692849934101, 0.003238216508179903, 0.0018846587045118213, 0.0013976775808259845, -0.012333526276051998, 0.0011654250556603074, -0.017994161695241928, -0.01493075955659151, 0.006386529188603163, 0.013299164362251759, -0.005633997265249491, 0.01579650305211544, 
0.032605260610580444, -0.04030372574925423, 0.0295152198523283, -0.014397993683815002, 0.0036094877868890762, 0.02614547684788704, 0.005800486542284489, -0.02091105468571186, 0.007598571013659239, 0.023827945813536644, 0.007831656374037266, -0.006163433194160461, -0.0167554821819067, 0.024187562987208366, -0.012180356308817863, 0.014864163473248482, -0.018633481115102768, -0.012426760047674179, 0.013339121825993061, -0.046350616961717606, -0.0049214232712984085, 0.022922245785593987, 0.02677147649228573, 0.013359100557863712, 0.003148312447592616, -0.007312209345400333, -0.001261988771148026, 0.007538634818047285, 0.034070368856191635, 0.006126805674284697, -0.0043953172862529755, -0.020618032664060593, -0.02153705433011055, 0.006383199244737625, 0.002515653148293495, -0.020671309903264046, 0.007438741158694029, -0.01085510104894638, 0.017954204231500626, -0.008650783449411392, -0.015103908255696297, -0.009736292995512486, -0.008624144829809666, -0.007771719712764025, -0.030527476221323013, -0.013079398311674595, 0.007378804963082075, 0.007365486118942499, -0.00893714465200901, -0.0028602860402315855, -0.0199387576431036, -0.005534104071557522, -0.00011040320532629266, 0.030900411307811737, -0.014531184919178486, -0.0036061578430235386, 0.007025847677141428, 0.005171157419681549, -0.010568739846348763, 0.00614345446228981, 0.008351102471351624, -0.006686209701001644, -0.015130545943975449, 0.005763859022408724, -0.011607632972300053, -0.015876417979598045, -0.016555694863200188, 0.008231230080127716, 0.024360712617635727, 0.016022928059101105, -0.02007194794714451, 0.0009631405118852854, -0.0019928766414523125, -0.0016507413238286972, -0.03409700468182564, -0.01660897210240364, -0.014171567745506763, -0.011594314128160477, -0.00028844267944805324, 0.01628931239247322, -0.012713122181594372, -0.02262922376394272, 0.041742194443941116, 0.00034400849835947156, -0.006789433304220438, -0.025253094732761383, -0.0335376001894474, 0.01283299457281828, 0.08353766053915024, 
-0.001618275884538889, -0.012972844764590263, 0.003762657754123211, 0.034523218870162964, 0.006632933393120766, -0.025745904073119164, -0.03718704730272293, 0.022456074133515358, 0.0007313041714951396, 0.007198996841907501, -0.006799422670155764, 0.014397993683815002, -0.004981359466910362, 0.004049019422382116, -0.01165425032377243, -0.006802752148360014, -0.014810887165367603, 0.0077783796004951, 0.0004247557953931391, -0.0016623955452814698, 0.010468846186995506, -0.00013964288518764079, 0.032631900161504745, -0.017821013927459717, 0.0022592595778405666, 0.02378798834979534, 0.015849780291318893, 0.03494942933320999, -0.017314886674284935, -0.014850844629108906, -0.017181694507598877, 0.006523050367832184, 0.004262125585228205, -0.013945142738521099, -0.006729497108608484, -0.00812467746436596, 0.01820726878941059, -0.004045689478516579, -0.011993887834250927, 0.02677147649228573, 0.044113002717494965, 0.026358583942055702, -0.04102296009659767, 0.016835397109389305, -0.005727231502532959, -0.0067494758404791355, 0.0032765092328190804, -0.017900928854942322, -0.007172358222305775, 0.035002708435058594, -0.01347897294908762, 0.011028250679373741, -0.020325012505054474, 0.009683016687631607, -0.006852698978036642, 0.005334316752851009, -0.00701918825507164, -0.0030833815690129995, 0.009456591680645943, 0.004468572326004505, -0.028529604896903038, 0.0035695303231477737, -0.010555421002209187, -0.018074076622724533, -0.02075122483074665, 0.014637738466262817, -0.0065896459855139256, -0.006609624717384577, -0.003095035906881094, 0.001224528648890555, 0.0011879010125994682, -0.028156667947769165, -0.03092704899609089, 0.016995226964354515, 0.014051695354282856, 0.0302877314388752, 0.006100167520344257, -0.01451786607503891, 0.008490953594446182, -0.00028240744723007083, -0.02300216071307659, 0.01491743978112936, -0.018580203875899315, -0.015277056954801083, 0.025146542116999626, 0.0167288426309824, -0.003922487609088421, -0.0006946765352040529, 0.001029736245982349, 
0.020657990127801895, 0.006423156708478928, 0.014557823538780212, 0.005943667609244585, 0.019885480403900146, -0.019099650904536247, -0.012912909500300884, 0.01931275799870491, 0.00634657172486186, -0.016302630305290222, -0.0021543714683502913, -0.014477908611297607, -0.007578592281788588, -0.013019462116062641, 0.02409433014690876, -0.01786097139120102, 0.0023724723141640425, 0.0010230767074972391, -0.011414505541324615, 0.005157838109880686, 0.014544503763318062, -0.018447013571858406, 0.009842846542596817, -0.0019346055341884494, 0.002054477808997035, 0.021776799112558365, -0.005254401825368404, 0.035668663680553436, 0.0034463282208889723, -0.017674501985311508, 0.02422752045094967, -0.018220586702227592, 0.014904120936989784, 0.021350586786866188, -0.03537564352154732, 0.029089007526636124, -0.01036895252764225, 0.003919157665222883, -0.015050631016492844, -0.0009240155341103673, 0.018606843426823616, 0.02458713762462139, 0.008197932504117489, 0.00041310154483653605, -0.03684074804186821, -0.01362548302859068, 0.005011327564716339, -0.004335381090641022, -0.006353231146931648, 0.007705124095082283, -0.03207249566912651, -0.016236035153269768, -0.0016457466408610344, -0.012460058555006981, 0.04067666083574295, -0.02550615929067135, -0.015250418335199356, -0.01189399417489767, 0.02200322411954403, 0.00554076349362731, -0.006040231324732304, -0.004641721490770578, -0.0036827430594712496, 0.023494968190789223, 0.009037038311362267, -0.01818062923848629, -0.014424631372094154, -0.007838315330445766, 0.015996290370821953, -0.018873225897550583, 0.03223232552409172, -0.01977892778813839, 0.011933951638638973, 0.013212589547038078, -0.009889463894069195, 0.00219432869926095, -0.026052244007587433, -0.00630328431725502, -0.037080492824316025, 0.022722458466887474, 0.008957123383879662, 0.023401733487844467, -0.00489811459556222, -0.01036229357123375, -0.012899589724838734, 0.04368678852915764, -0.010002676397562027, 0.014957397244870663, -0.029115647077560425, 
-0.04528508707880974, -0.008477634750306606, -0.013279185630381107, -0.00350959412753582, 0.00916357059031725, 0.012453398667275906, -0.019738970324397087, 0.014318078756332397, -0.011541036888957024, 0.019818885251879692, -0.04096968472003937, 0.03156637027859688, -0.03950457647442818, 0.03652108833193779, -0.01362548302859068, 0.006879337131977081, -0.006253337487578392, 0.0039025088772177696, -0.013865227811038494, -0.010195803828537464, 0.014810887165367603, 0.015183823183178902, -0.008570868521928787, -0.004581785295158625, -0.016688885167241096, 0.006686209701001644, -0.0007421259651891887, 0.00630328431725502, -0.008277847431600094, 0.012866292148828506, -0.014411312527954578, -0.009356698021292686, -0.014251482672989368, -0.0065396991558372974, -0.033883899450302124, 0.009609761647880077, -0.00012236963084433228, -0.016688885167241096, 0.02791692316532135, -0.005241082515567541, -0.011234696954488754, 0.0043154023587703705, 0.007292230613529682, 0.04765589162707329, -0.0008665767381899059, 0.018087396398186684, 0.03095368854701519, -0.030873773619532585, -0.014437951147556305, -0.02839641273021698, 0.025412924587726593, -0.005134529434144497, 0.02217637374997139, 0.032312240451574326, -0.011754143983125687, 0.003321461146697402, 0.02585245668888092, 0.00653636921197176, -0.030261091887950897, -0.04062338545918465, 0.0231353510171175, 0.027037860825657845, -0.004771582782268524, -0.00990278273820877, -0.01770114153623581, 0.0027304242830723524, 0.007771719712764025, -0.010582058690488338, -0.0053975824266672134, -0.052717167884111404, -0.006389858666807413, -0.018060756847262383, -0.0020661321468651295, -0.01022244244813919, 0.01786097139120102, 0.007099103182554245, -0.015516801737248898, -0.020671309903264046, -0.023894542828202248, -0.016222715377807617, 0.027037860825657845, 0.0029734985437244177, 0.05205120891332626, -0.002831982681527734, -0.009569804184138775, 0.013059419579803944, 0.010595378465950489, -0.010841782204806805, -0.023641478270292282, 
-0.028103390708565712, 0.05466176196932793, -0.0009573134011588991, 0.00041539076482877135, -0.026758158579468727, 0.028636157512664795, 0.017581269145011902, 0.010628676041960716, 0.01720833219587803, -0.005707252770662308, 0.002517318120226264, -0.0021510415244847536, 0.013785312883555889, 0.038865260779857635, -0.002052812837064266, -0.002873605117201805, -0.017794374376535416, -0.00957646407186985, -0.0023641479201614857, 0.0070591457188129425, 0.013538909144699574, -0.0006697031785733998, -0.0087706558406353, -0.003339775139465928, -0.014051695354282856, 0.013265866786241531, -0.01576986536383629, -0.01163427159190178, -0.02220301143825054, 0.03002134896814823, -0.04288763925433159, 0.009057017043232918, -0.0036660940386354923, -0.004205519333481789, -0.0159430131316185, -0.005670625250786543, 0.022935563698410988, -0.015210460871458054, 0.006509731058031321, -0.017514672130346298, -0.020031990483403206, -0.035801857709884644, 0.012586589902639389, 0.007944868877530098, 0.003982423804700375, -0.016502417623996735, 0.05324993282556534, -0.0020128553733229637, -0.0006543028866872191, -0.018593523651361465, -0.02677147649228573, 0.0036527749616652727, -0.00494140200316906, 0.0023058766964823008, -0.013665440492331982, -0.03063402883708477, 0.024800244718790054, -0.007465379778295755, -0.009716315194964409, 0.004335381090641022, -0.047149766236543655, -0.013512270525097847, -0.006250008009374142, 0.013998419046401978, -0.010622016154229641, -0.03159300610423088, -0.028263220563530922, -0.013745355419814587, -0.02314867079257965, 0.013598845340311527, -0.01253331359475851, 0.00821791123598814, 0.03148645535111427, 0.029595134779810905, -0.01644914224743843, -0.009516527876257896, 0.006356561090797186, 0.0039025088772177696, -0.0036827430594712496, -0.017301566898822784, -0.05940337851643562, -0.019086331129074097, -0.024027733132243156, 0.03465640917420387, -0.0015058956341817975, -0.003699391847476363, -0.02422752045094967, 0.010721909813582897, 
-0.01142116542905569, -0.019579140469431877, 0.006353231146931648, 0.005917029455304146, 0.015503481961786747, -0.02394781820476055, 0.015317014418542385, 0.0059037101455032825, -0.015343653038144112, 0.015903057530522346, -0.02297552116215229, -0.00021196166926529258, -0.015290375798940659, -0.01866011880338192, 0.008411038666963577, 0.011294633150100708, -0.041236065328121185, -0.013265866786241531, 0.002547285985201597, -0.009556485339999199, 0.0018330470193177462, 0.010428888723254204, -0.021936628967523575, -0.0010730234207585454, -0.006289965007454157, -0.007844975218176842, -0.010428888723254204, -0.010229101404547691, -0.001442629611119628, -0.0191262885928154, -0.02730424329638481, 0.0004973867326043546, 0.003506264416500926, -0.024826882407069206, -0.003839242970570922, 0.011381207965314388, -0.009916101582348347, 0.012773058377206326, 0.007498677354305983, -0.026598328724503517, 0.0016207732260227203, 0.006283305585384369, 0.020791182294487953, -0.012093781493604183, -0.02041824534535408, 0.0155567592009902, -0.013052759692072868, -0.02614547684788704, -0.03201922029256821, 9.323399717686698e-05, -0.026997903361916542, 0.016995226964354515, -0.0034629772417247295, -0.018740033730864525, 0.03937138617038727, -0.034842878580093384, 0.027863647788763046, -0.02252267114818096, -0.006556347943842411, -0.006999209523200989, -0.0035262431483715773, -0.026385221630334854, 0.03590840846300125, 0.042701173573732376, -0.004525178577750921, 0.006010263226926327, -0.012147058732807636, -0.015370290726423264, -0.0018064087489619851, -0.0287160724401474, -0.016009610146284103, -0.01752799190580845, -0.0037693174090236425, 0.004391987342387438, 0.01852692849934101, -0.020205140113830566, -0.028529604896903038, 0.018740033730864525, -0.02204318158328533, -0.03132662549614906, 0.3066599369049072, -0.004924753215163946, 0.007798358332365751, 0.017807694151997566, 0.02519981749355793, 0.016022928059101105, 0.026704881340265274, 0.007938208989799023, -0.006130135618150234, 
0.01720833219587803, -0.041422534734010696, 0.00011425327102188021, -0.011840717867016792, 0.003399711335077882, 0.002635525306686759, -0.04038364067673683, -0.032285600900650024, -0.011714186519384384, -0.009236825630068779, -0.050959039479494095, 0.018074076622724533, -0.010422229766845703, -0.014371355064213276, -0.028183305636048317, 0.013838589191436768, 0.0017980842385441065, -0.00925680436193943, -0.019112970679998398, 0.022868968546390533, -0.008078060112893581, -0.013272525742650032, -0.003459647297859192, -0.0039957426488399506, -0.01276639848947525, -0.017967524006962776, 0.004728295840322971, 0.03273845463991165, 0.015703270211815834, 0.021030927076935768, 0.028023475781083107, 0.030181176960468292, -0.025239774957299232, 0.007159039378166199, 0.0033381101675331593, -0.009263464249670506, 0.026678243651986122, -0.013505610637366772, -0.002660498721525073, 0.0001414118305547163, 0.01897977851331234, -0.010728569701313972, -0.010721909813582897, 0.01756794936954975, 0.04912099987268448, 0.013905185274779797, 0.00015264986723195761, 0.014970717020332813, -0.00027429108740761876, -0.012320207431912422, 0.01204716507345438, -0.01389186643064022, 0.0478956364095211, -0.0013610499445348978, 0.017301566898822784, -0.01996539533138275, 0.006553018465638161, -0.030341006815433502, -0.015370290726423264, 0.0063732098788022995, -0.014211525209248066, -0.005467507988214493, -0.02996807172894478, -0.013219249434769154, 0.0033797326032072306, -0.022056501358747482, -0.012813015840947628, 0.021883351728320122, 0.003706051502376795, 0.037080492824316025, 0.017940886318683624, -0.008031442761421204, -0.002275908598676324, -0.011254675686359406, -0.011614292860031128, -0.012813015840947628, -0.032924920320510864, 0.02983487956225872, -0.01723497174680233, -0.03140654042363167, -0.010974973440170288, 0.006496411748230457, 0.018087396398186684, -0.005051285028457642, -0.01642250269651413, 0.020684629678726196, 0.02185671404004097, 0.01723497174680233, 0.030394284054636955, 
-0.023015478625893593, -0.0003858389100059867, -0.03255198523402214, -0.0143580362200737, 0.015290375798940659, 0.01022244244813919, -0.0175413116812706, 0.01077518705278635, 0.004774912726134062, -0.0011862361570820212, 0.0024607116356492043, -0.009636400267481804, -0.002936871023848653, -0.03002134896814823, -0.0007238121470436454, -0.00029447791166603565, 0.00948988925665617, -0.0034496579319238663, -0.004292093683034182, -0.0043953172862529755, 0.027357518672943115, -0.003403041046112776, 0.008024783805012703, -0.021417181938886642, -0.012040505185723305, 0.022935563698410988, -0.012093781493604183, -0.026092201471328735, -0.017927566543221474, 0.010928357020020485, 0.023561563342809677, -0.03977096080780029, 0.021883351728320122, -0.04272780939936638, 0.024453945457935333, 0.007785039022564888, -0.009636400267481804, 0.02839641273021698, 0.010328995063900948, -0.011740824207663536, -0.01020246371626854, -0.02378798834979534, 0.0024640413466840982, -0.00466502970084548, 0.0036561046727001667, -0.02040492743253708, 0.025719264522194862, -0.009030378423631191, 0.018313821405172348, -0.02218969166278839, 0.04515189304947853, -0.026105519384145737, -0.032924920320510864, 0.01325254701077938, -0.006406507920473814, -0.012320207431912422, 0.013039440847933292, -0.02518649958074093, -0.03433674946427345, -0.0075719328597188, 0.0070658051408827305, 0.029941434040665627, -0.03383062407374382, -0.008724038489162922, 0.03633462265133858, -0.009776250459253788, -0.006872677709907293, 0.0005144518800079823, -0.16803430020809174, 0.009123613126575947, 0.02697126381099224, -0.0199121180921793, 0.02502666972577572, -0.0018447012407705188, -0.004115615040063858, 0.003922487609088421, -0.009676357731223106, 0.006296624895185232, 0.01977892778813839, 0.002302546752616763, 0.006170093081891537, -0.019738970324397087, -0.023161988705396652, 0.028050115332007408, -0.03415028378367424, -0.011501080356538296, 0.04504534229636192, 0.019419310614466667, -0.006259997375309467, 
-0.006689539644867182, 0.006925954483449459, -0.0022675839718431234, 0.016675567254424095, 0.023255223408341408, 0.014890802092850208, -0.0077517409808933735, -0.020657990127801895, -0.03223232552409172, -0.007838315330445766, -0.0026704880874603987, -0.0053842635825276375, -0.024627095088362694, 0.0447256825864315, 0.008504272438585758, 0.0029268816579133272, 0.016302630305290222, -0.02342837303876877, 0.009156910702586174, 0.03332449495792389, 0.016009610146284103, 0.007432081736624241, 0.0002559772692620754, -0.025133222341537476, 0.023375095799565315, 0.016382545232772827, 0.0051611680537462234, 0.0046184128150343895, 0.00013454415602609515, -0.006170093081891537, -0.0015799832763150334, 0.025972329080104828, 0.01373203657567501, 0.014890802092850208, 0.013545568101108074, -0.002442397875711322, 0.02298884093761444, -0.0018197279423475266, 0.028423050418496132, 0.004468572326004505, -0.024014415219426155, 0.005673954728990793, 0.008457656018435955, 0.010915037244558334, 0.00257392437197268, -0.0010979968355968595, -0.0011554356897249818, -0.022282926365733147, 0.011367888189852238, -0.010142527520656586, -0.03300483524799347, 0.03417691960930824, 0.008810613304376602, 0.0069659119471907616, 0.014397993683815002, -0.004691667854785919, -0.013139334507286549, 0.0017248289659619331, 0.01975228823721409, -0.012393462471663952, 0.01624935492873192, -0.03622806817293167, -0.014970717020332813, -0.002856956096366048, 0.010548761114478111, 0.018859906122088432, -0.016142800450325012, 0.012246951460838318, -0.0022908926475793123, 0.003909168299287558, -0.024360712617635727, -0.018540246412158012, -0.025239774957299232, -0.004345370456576347, 0.023641478270292282, 0.0017448076978325844, -0.004178881179541349, -0.006086848210543394, -0.022602586075663567, 0.017288247123360634, -0.004325391724705696, -0.0067861033603549, 0.022123096510767937, -0.003942466340959072, -0.0055274441838264465, -0.0004255882522556931, 0.024853520095348358, 0.03745343163609505, 
-0.008231230080127716, -0.010189143940806389, 0.007731762249022722, 0.027543988078832626, 0.01980556547641754, -0.01475760992616415, 0.011114824563264847, -0.0191529281437397, -0.008923825807869434, 0.025466201826930046, -0.006566337309777737, -0.017994161695241928, 0.023175308480858803, 0.003198259277269244, 0.009709655307233334, -0.008810613304376602, -0.006839379668235779, -0.08795961737632751, 0.0034829559735953808, 0.026904668658971786, 0.018233906477689743, -0.015903057530522346, -0.005890390835702419, -0.006053550634533167, 0.031672921031713486, -0.01267982367426157, 0.023028798401355743, -0.01566331274807453, -0.019898800179362297, 0.0012120420578867197, -0.007525315508246422, 0.006779443938285112, -0.006200061179697514, 0.035668663680553436, -0.017181694507598877, -0.008943804539740086, 0.028209945186972618, -0.005860422737896442, -0.006170093081891537, 0.00024723660317249596, -0.008590847253799438, -0.0019379352452233434, -0.006932613905519247, -0.021244032308459282, 0.020951012149453163, 0.009210187010467052, 0.010189143940806389, 0.022775733843445778, -0.002665493404492736, -0.01434471644461155, -0.014104972593486309, 0.010948335751891136, -0.013359100557863712, -0.015210460871458054, 0.009729634039103985, 0.03092704899609089, -0.03223232552409172, -0.011048229411244392, 0.011361229233443737, 0.021297309547662735, -0.02028505504131317, 0.01608952507376671, -0.0011296297889202833, -0.004608423449099064, 0.03417691960930824, 0.00609683757647872, -0.012786377221345901, -0.027837008237838745, -0.012759738601744175, -0.0013568876311182976, 0.0010080926585942507, 0.04845504090189934, -0.006259997375309467, 0.009176889434456825, 0.026917988434433937, -0.014890802092850208, 0.010875079780817032, -0.009370016865432262, -0.012506674975156784, -0.005917029455304146, -0.01929943822324276, 0.016475779935717583, 0.014491227455437183, 0.019858842715620995, 0.00032111621112562716, 0.013372419402003288, -0.01578318513929844, -0.021017607301473618, 
-0.0017448076978325844, -0.012253611348569393, 0.01736816205084324, -0.023161988705396652, -0.013265866786241531, -0.007272251881659031, 0.00940331444144249, 0.009370016865432262, -0.03428347408771515, -0.000847430492285639, -0.01720833219587803, 0.0069725713692605495, -0.03209913522005081, 0.02408101037144661, 0.012140398845076561, 0.02246939390897751, -0.002715440234169364, 0.008297826163470745, -0.02233620174229145, -0.026371903717517853, -0.010701931081712246, -0.003452987875789404, -0.02328186109662056, 0.014730972237884998, 0.009276783093810081, 0.00016742579464334995, -0.011767462827265263, 0.003542891936376691, 0.010868420824408531, -0.0013235898222774267, -0.03782636672258377, -0.06483758985996246, -0.0005236088181845844, 0.007318869233131409, -0.0037393493112176657, 0.004421955440193415, -0.013705397956073284, 0.0135722067207098, -0.008457656018435955, -0.021577011793851852, 0.012187016196548939, -0.02679811604321003, 0.0271444134414196, 0.005803816486150026, -0.010648654773831367, -0.01491743978112936, -0.004651710856705904, -0.0008915501530282199, 0.004045689478516579, 0.0025323019362986088, 0.024174245074391365, 0.008983762003481388, 0.014864163473248482, 0.0026421849615871906, -0.0014834195608273149, -0.02983487956225872, 0.01124135684221983, -0.03225896507501602, 0.03209913522005081, -0.01899309828877449, -0.024507222697138786, 0.0034463282208889723, -0.0223495215177536, 0.010275718756020069, 0.009483229368925095, -0.004025710746645927, -0.011048229411244392, 0.02187003195285797, 0.008763995952904224, 0.01724828965961933, 0.05551418662071228, -0.05010661482810974, -0.02139054425060749, 0.024294117465615273, 0.00426878547295928, -0.01690199226140976, 0.015090588480234146, -0.012087122537195683, 0.037266962230205536, 0.03921155631542206, -0.019059693440794945, 0.020631352439522743, 0.02551947720348835, -0.010522122494876385, -0.009669697843492031, 0.013618824072182178, -0.021257352083921432, 0.00701918825507164, 0.002300882013514638, 
-0.005950327031314373, -0.038225941359996796, 0.02855624258518219, 0.011228037066757679, 0.007105762604624033, -0.023881223052740097, 0.00813133642077446, -0.02618543431162834, -0.020937692373991013, -0.008823932148516178, 0.015117227099835873, -0.03446994349360466, -0.012060483917593956, -0.0063798693008720875, 0.019033055752515793, 0.0319393053650856, -0.009962718933820724, 0.014131610281765461, -0.00048490005428902805, -0.025173179805278778, -0.01708845980465412, 0.023042116314172745, -0.003646115306764841, 0.012686483561992645, -0.023015478625893593, 0.0004711646761279553, 0.019352715462446213, 0.02440067008137703, -0.0012003877200186253, 0.013705397956073284, 0.006569667253643274, 0.022575946524739265, 0.028636157512664795, 0.012013866566121578, -0.008497613482177258, 0.006223369389772415, -0.001107986201532185, 0.016502417623996735, -0.01165425032377243, 0.002802014583721757, 0.00662627350538969, 0.03175283595919609, 0.001671552425250411, 0.010235761292278767, -0.0048348489217460155, -0.016862034797668457, -0.00415224302560091, -0.007245613727718592, -0.025253094732761383, -0.033910539001226425, 0.004029040690511465, 0.015436886809766293, -0.02378798834979534, 0.005037965718656778, 0.022615903988480568, 0.03164628520607948, -0.003809274872764945, 0.010408909991383553, 0.007192336954176426, -0.013012802228331566, -0.026052244007587433, 0.04256797954440117, 0.015583396889269352, 0.010741888545453548, 0.019659055396914482, -0.015023993328213692, -0.0016032918356359005, 0.022282926365733147, 0.01627599261701107, 0.002553945640102029, 0.0021227383986115456, -0.022535989060997963, -0.0041489130817353725, -0.015610035508871078, -0.029568497091531754, -0.006499741692095995, 0.004731625318527222, -0.011214718222618103, 0.013299164362251759, 0.015756545588374138, -0.009789570234715939, 0.01403837651014328, 0.01627599261701107, -0.010668633505702019, 0.014238163828849792, 0.013811951503157616, 0.018566885963082314, 0.012613228522241116, 0.0030617378652095795, 
-0.004944731947034597, -0.003779306774958968, 0.018247226253151894, -0.02855624258518219, 0.039318110793828964, -0.025412924587726593, -0.051758188754320145, 0.0010222442215308547, -0.01770114153623581, 0.01419820636510849, -0.025119904428720474, 0.015356971882283688, 0.026358583942055702, -0.02060471475124359, 0.03337777033448219, -0.0050013381987810135, -0.03393717482686043, 0.001778105623088777, 0.01868675835430622, -0.010895058512687683, 2.286626295244787e-05, -0.02027173526585102, 0.011840717867016792, 0.023574883118271828, -0.012240292504429817, -0.0032748442608863115, 0.022376159206032753, -0.01643582247197628, 0.001834711991250515, 0.01980556547641754, 0.030554113909602165, -0.008970443159341812, 0.0013851908734068274, -0.006173422560095787, -0.019232843071222305, -0.019872160628437996, 0.028662795200943947, -0.0012578265741467476, -0.015863100066781044, 0.008164634928107262, -0.026105519384145737], "c1f7df04-5e6c-4327-a0cc-4a3489d50d19": [-0.006850742734968662, -0.014677336439490318, 0.005485869012773037, -0.016431232914328575, -0.009540928527712822, 0.019451098516583443, -0.016431232914328575, -0.019160980358719826, -0.03536802902817726, -0.019160980358719826, 0.03251959756016731, 0.016826847568154335, 0.009989292360842228, 0.003117120824754238, 0.01025303639471531, 0.01136075984686613, 0.028853559866547585, 0.009409056045114994, 0.041381385177373886, -0.012626729905605316, 0.012257488444447517, 0.01395204197615385, -0.014796021394431591, -0.00324569595977664, -0.020070895552635193, 0.009929950349032879, 0.03204485774040222, -0.013899292796850204, 0.00922443624585867, -0.004529797937721014, 0.01788182370364666, 0.010549748316407204, -0.01956978254020214, -0.008472766727209091, -0.022009411826729774, -0.02038738876581192, -0.009316746145486832, -0.010279410518705845, 0.01516526285558939, -0.010780523531138897, 0.010648651979863644, 0.00802440196275711, -0.0017505987780168653, 0.004740792792290449, -0.019451098516583443, 0.013226746581494808, 
-0.012395953759551048, -0.014967454597353935, -0.005555101670324802, 0.031359124928712845, 0.007661754265427589, 0.008327707648277283, -0.015468567609786987, -0.0025451267138123512, -0.005429823417216539, -0.008182648569345474, 0.01875217631459236, 0.04773760959506035, -0.024251233786344528, -0.018382936716079712, 0.010859646834433079, 0.004559469409286976, -0.011248668655753136, -0.0110376738011837, -0.01768401451408863, 0.024000676348805428, -0.0095475222915411, -0.034682296216487885, -0.0009964567143470049, 0.0009791484335437417, 0.0027561215683817863, 0.026097439229488373, -0.006534249987453222, 0.004753980319947004, 0.06361497938632965, -0.004803432151675224, -0.02658536471426487, -0.0058320327661931515, -0.008657386526465416, -0.0009090915555134416, -0.00871672946959734, -0.01558725256472826, -0.0339701883494854, 0.025701822713017464, 0.010081603191792965, 0.017116965726017952, 0.008598044514656067, 0.03906044363975525, -0.02438310533761978, -0.02145555056631565, 0.006165008991956711, 0.01340477354824543, 0.037161488085985184, 0.016721351072192192, 0.010912396013736725, 0.01949065923690796, 0.003969342447817326, 0.011901434510946274, -0.010279410518705845, -0.031781114637851715, -0.014044351875782013, -0.0017044437117874622, -0.012316830456256866, -0.014848770573735237, -0.0014654259430244565, -0.0162466112524271, 0.01540263183414936, -0.019609343260526657, 0.005743019282817841, 0.005202344618737698, 0.01802688278257847, -0.0066826059482991695, -0.01414984930306673, 0.005370480939745903, -0.015679562464356422, -0.015705937519669533, 0.030963510274887085, -0.008222210220992565, -0.017960945144295692, 0.020941250026226044, -0.0187258031219244]}, "text_id_to_doc_id": {"da552579-e0f4-4ee0-be68-a3c392e39dc2": "87411099-60d8-4272-a7d1-6e8676fc42a0", "c1f7df04-5e6c-4327-a0cc-4a3489d50d19": "87411099-60d8-4272-a7d1-6e8676fc42a0"}}}}llama\_index根据向量索引进行语义化查询llama\_index可以在本地匹配向量索引后再构建提示词:class LLma: def query_index(self,prompt,index_path="./index.json"): # 加载索引 
local_index = GPTSimpleVectorIndex.load_from_disk(index_path) # 查询索引 res = local_index.query(prompt) print(res)通过GPTSimpleVectorIndex.load\_from\_disk方法将向量索引导入,执行方法:if __name__ == '__main__': llma = LLma() # 建立索引 #llma.create_index() # 查询索引 llma.query_index("讲一下美女蛇的故事")程序返回:美女蛇的故事可以追溯到古庙里的一个读书人。晚间,他在院子里纳凉的时候,突然听到有人在叫他的名字。他四面看时,却见一个美女的脸露在墙头上,向他一笑,隐去了。他很高兴,但竟给那走来夜谈的老和尚识破了机关,说他脸上有些妖气,一定遇见“美女蛇”了。可以看到,“美女蛇”的故事终于是我们想要的“美女蛇”的故事了。llama\_index模型定制化llama\_index默认的答案生成模型方案为text-davinci-002,我们也可以定制化适合自己的模型配置:class LLma: def __init__(self) -> None: self.llm_predictor = LLMPredictor(llm=OpenAI(temperature=0, model_name="text-davinci-003",max_tokens=1800)) self.service_context = ServiceContext.from_defaults(llm_predictor=self.llm_predictor) # 查询本地索引 def query_index(self,prompt,index_path="./index.json"): # 加载索引 local_index = GPTSimpleVectorIndex.load_from_disk(index_path) # 查询索引 res = local_index.query(prompt) print(res) # 建立本地索引 def create_index(self,dir_path="./data"): # 读取data文件夹下的文档 documents = SimpleDirectoryReader(dir_path).load_data() index = GPTSimpleVectorIndex.from_documents(documents,service_context=self.service_context) print(documents) # 保存索引 index.save_to_disk('./index.json') if __name__ == '__main__': llma = LLma()这里通过初始化函数定制self.llm\_predictor属性,生成本地向量索引时,通过service\_context参数进行动态调用即可:index = GPTSimpleVectorIndex.from\_documents(documents,service\_context=self.service\_context) 。结语藉此,我们就可以通过垂直领域语料来“定制化”ChatGPT的回答了,最后奉上项目地址:github.com/zcxey2911/llama\_index\_examples\_python3.10,与君共觞。
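上文index.json里存储的embedding向量,查询时的匹配本质就是余弦相似度计算:把问题向量和索引里的每段文本向量逐一比较,取最相似的一段去构建提示词。下面用纯Python写一个极简示意(仅为演示原理,函数名均为假设,并非llama\_index的真实实现):

```python
import math

def cosine_similarity(a, b):
    # 余弦相似度:点积除以两个向量模长的乘积
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def query_index(query_vec, index):
    # index形如 {text_id: embedding向量},返回与查询向量最相似的text_id
    return max(index, key=lambda tid: cosine_similarity(query_vec, index[tid]))
```

真实场景中embedding维度有上千维,且会取top-k个片段拼入提示词,但相似度匹配的思路与此一致。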

好饭不怕晚,Google基于人工智能AI大语言对话模型Bard测试和API调用(Python3.10)

谷歌(Google)作为开源过著名深度学习框架Tensorflow的超级大厂,是人工智能领域一股不可忽视的中坚力量,旗下新产品Bard已经公布测试了一段时间,毁誉参半,很多人把Google的Bard和OpenAI的ChatGPT进行对比,Google Bard在ChatGPT面前似乎有些技不如人。 事实上,Google Bard并非对标ChatGPT的产品,Bard是基于LaMDA模型对话而进行构建的,Bard旨在构建一个对话式的AI系统,使其能够更好地理解人类语言,并且具备进行多轮对话的能力。而GPT的目标是生成自然语言文本。在特征数据层面,Bard使用了像Gmail、Meet等Google社交产品线中的对话数据来进行训练,这些数据已经经过了严格的隐私保护措施。而GPT则是通过大规模的网页爬虫来获取数据,它的训练数据量比LaMDA要大得多。模型结构层面,Bard采用了一种称为“Transformer”的神经网络结构,该结构可以处理长文本并保持信息连贯性。GPT也使用了Transformer结构,但它还采用了一种称为“自回归”的方式,即按照时间步骤一个接一个地生成文本。说白了,在应用层面上,Bard适合开发智能助手、聊天机器人、虚拟客服等应用。而GPT更适用于自然语言生成任务,例如文章撰写、机器翻译等等。Bard对话测试(英文/中文)访问 bard.google.com ,随后输入聊天内容: 一望而知,对话输入的内容只支持英文,并不能输入中文。 但事实上,只要稍微改变一下对话逻辑,也可以让Bard返回中文信息,比如: Bard代码能力私以为Bard的代码能力并不输于ChatGPT: 尤其是对自家的深度学习框架Tensorflow更是如数家珍,生成的代码如下:import tensorflow as tf def load_model(): """Loads the TensorFlow image recognition model.""" model = tf.keras.models.load_model('mobilenet_v2_1.0_224') return model def recognize_objects(image): """Recognizes objects in an image.""" # Convert the image to a tensor. image = tf.image.convert_image_dtype(image, dtype=tf.float32) # Resize the image to 224x224 pixels. image = tf.image.resize(image, [224, 224]) # Normalize the image. image = tf.image.normalize(image, mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]) # Predict the labels of the objects in the image. predictions = model.predict(image) # Return the labels of the objects in the image. return predictions def main(): # Load the image recognition model. model = load_model() # Load an image. image = tf.io.read_file('image.jpg') # Recognize objects in the image. predictions = recognize_objects(image) # Print the labels of the objects in the image. 
for prediction in predictions: print(prediction) if __name__ == '__main__': main()

这里是使用Tensorflow内置小模型mobilenet\_v2\_1.0\_224的智能识图逻辑,简洁而严谨。

Bard网络架构

在浏览器(B端)架构上,Google极其自信地使用了HTTP2协议的接口进行通信,而没有仿效ChatGPT使用SSE协议: 我们知道ChatGPT使用SSE协议其实是等而下之的次优选择,因为GPT模型在推理上需要时间,所以走的模式是一边推理一边返回的流式模型,关于流式返回,请移玉步至:逐句回答,流式返回,ChatGPT采用的Server-sent events后端实时推送协议Python3.10实现,基于Tornado6.1,这里不再赘述。 而Google的Bard选择一次性返回所有推理数据: 所以推理效率上,Bard要优于ChatGPT,但仅限于免费产品线,截至本文发布,ChatGPT的收费产品gpt-3.5-turbo和gpt-4的推理效率都要远远高于其免费产品。

Bard的远程接口API调用

和免费版本的ChatGPT一样,Bard目前只支持浏览器端(B端)的使用,但也可以通过浏览器保存的Token进行远程调用,首先安装Bard开源库:pip3 install --upgrade GoogleBard随后复制浏览器端的Token密钥: 接着在终端通过Session进行注入:python3 -m Bard --session UggPYghLzQdQTNx1kQiCRzbPBA1qhjC-dndTiIPCk3YPLR5TexmP7OQ7AfUdsfdsf1Q.随后就可以进入终端内的对话场景,使用alt+enter组合键或者esc+enter组合键发送信息即可:➜ work python3 -m Bard --session UggPYghLzQdQTNx1kQiCRzbPBA1qhjC-dndTiIPCk3YPLR5TexmP7OQdfgdfgdfUSg0UQ. Bard - A command-line interface to Google's Bard (https://bard.google.com/) Repo: github.com/acheong08/Bard Enter `alt+enter` or `esc+enter` to send a message. Google Bard: Hi there! How can I help you today?非常方便,主要是速度相当惊艳。

结语

仅就免费版本所提供的产品力而言,Google Bard和ChatGPT可谓是各有千秋,私以为Google Bard在效率和使用逻辑上要更胜一筹,并不是网上所传言的那么不堪。所谓一枝独秀不是春,百花齐放才是春满园,Google Bard和百度的文心一言,都会对ChatGPT形成压力,让ChatGPT保持光速更新,成为更好的自己。
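上文提到的SSE流式返回与Bard式一次性返回的差异,可以用Python生成器直观示意(纯属演示概念,与Bard/ChatGPT的真实实现无关):

```python
def stream_reply(tokens):
    # 流式返回:推理一边进行,一边逐段产出(SSE的思路)
    for token in tokens:
        yield token

def batch_reply(tokens):
    # 一次性返回:等全部推理完成后整体返回(Bard的思路)
    return list(tokens)
```

流式模式下,调用方拿到第一个token就可以开始渲染,而一次性模式必须等整段生成结束;两者最终内容一致,区别只在首字延迟和交互体验。
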

性能的极致,Rust的加持,Zed-Dev编辑器快速搭建Python3.10开发环境

快就一个字,甚至比以快著称于世的Sublime 4编辑器都快,这就是Zed.dev编辑器。其底层由 Rust 编写,比微软基于Electron技术开源的编辑器VSCode快一倍有余,性能上无出其右,同时支持多人编辑代码。

安装和配置Zed.dev

Zed.dev编辑器还在灰度测试阶段,暂时只释出了Mac版本,在Zed.dev官网下载,安装成功后,进入Zed.dev编辑器,使用组合键 Command + , 调出编辑器的配置文件:// Zed settings // For information on how to configure Zed, see the Zed // documentation: https://zed.dev/docs/configuring-zed // To see all of Zed's default settings without changing your // custom settings, run the `open default settings` command // from the command palette or from `Zed` application menu. { "theme": "One Dark" }编辑器默认使用暗黑风格的One Dark主题,也可以通过配置theme来选择别的主题,比如"Rosé Pine Moon":"theme": "Rosé Pine Moon",如图所示: 除此之外,我们也可以配置其他的设置,以方便日常的开发:// Zed settings // For information on how to configure Zed, see the Zed // documentation: https://zed.dev/docs/configuring-zed // To see all of Zed's default settings without changing your // custom settings, run the `open default settings` command // from the command palette or from `Zed` application menu. { "theme": "Rosé Pine Moon", "soft_wrap": "editor_width", "autosave": "on_focus_change", "tab_size": 4, "buffer_font_size": 15, "language_overrides": { "Python": { "format_on_save": { "external": { "command": "black", "arguments": ["-"] } } } } }这里配置了自动保存,缩进空格,自适应断行等等。 Zed.dev默认支持的语言列表:C Elixir JavaScript Markdown Python TypeScript也就是说默认支持上述语言的自动补全,而不需要单独配置: 虽然Zed.dev还不支持插件,但内部集成了系统的终端,直接通过组合键 esc + shift 打开终端即可运行代码: 非常方便,也可以通过组合键 Command + b 来自由收放左侧文件列表菜单栏。 大体上,基本不需要配置什么,就可以直接用Zed.dev来写代码了,即所谓开箱可用。

项目共享协作

我们可以从协作菜单中添加一个现有的Zed.dev用户作为联系人,从窗口右上角的加号图标进行部署,或者通过组合键command-shift-c,然后单击搜索框右侧的添加图标按钮:随后可以在协作菜单中看到所有在线或者离线联系人。搜索或点击他们将发送一个请求,开始呼叫并与他们共享当前的项目,他们将收到加入呼叫的通知。 这之后所有连入Zed.dev项目的人就可以进行代码联调了,效率上要比Git高出了不少。

结语

快速轻便,简单清爽,这就是Zed.dev给我们的第一印象,很明显,在桌面编辑器层面,Rust具有极其出挑的优势,它以闪电般的速度处理功能复杂的任务,同时还减少了与内存、边界、空变量、未初始化变量或整数溢出相关的错误,下面是Zed.dev的内存占用情况:最后附上邀请码,与君共觞:zed.dev/invites/T7MtltpVii8thwIW

口播神器,基于Edge,微软TTS(text-to-speech)文字转语音免费开源库edge-tts实践(Python3.10)

不能否认,微软Azure在TTS(text-to-speech文字转语音)这个人工智能细分领域的影响力是统治级的,一如ChatGPT在NLP领域的随心所欲,予取予求。君不见几乎所有的抖音营销号口播均采用微软的语音合成技术,其影响力由此可见一斑,仅有的白璧微瑕之处就是价格略高,虽然国内也可以使用科大讯飞语音合成进行平替,但我们只想要最好的那一个,本次我们使用免费的开源库edge-tts来实现文本转语音操作,薅微软edge的羊毛。TTS文本转语音基础使用方式首先安装edge-tts库:pip3 install edge-tts安装成功后,直接在终端运行edge-tts命令:edge-tts显示帮助菜单即代表安装成功:➜ Downloads edge-tts usage: edge-tts [-h] [-t TEXT] [-f FILE] [-v VOICE] [-l] [--rate RATE] [--volume VOLUME] [-O OVERLAPPING] [--write-media WRITE_MEDIA] [--write-subtitles WRITE_SUBTITLES] [--proxy PROXY]随后输入命令:edge-tts --list-voices该命令可以将Edge浏览器中,内置的语言角色列表列出来:Name: af-ZA-AdriNeural Gender: Female Name: af-ZA-WillemNeural Gender: Male Name: am-ET-AmehaNeural Gender: Male Name: am-ET-MekdesNeural Gender: Female Name: ar-AE-FatimaNeural Gender: Female Name: ar-AE-HamdanNeural Gender: Male Name: ar-BH-AliNeural Gender: Male Name: ar-BH-LailaNeural Gender: Female Name: ar-DZ-AminaNeural Gender: Female Name: ar-DZ-IsmaelNeural Gender: Male Name: ar-EG-SalmaNeural Gender: Female Name: ar-EG-ShakirNeural Gender: Male Name: ar-IQ-BasselNeural Gender: Male Name: ar-IQ-RanaNeural Gender: Female Name: ar-JO-SanaNeural Gender: Female Name: ar-JO-TaimNeural Gender: Male Name: ar-KW-FahedNeural Gender: Male Name: ar-KW-NouraNeural Gender: Female Name: ar-LB-LaylaNeural Gender: Female Name: ar-LB-RamiNeural Gender: Male Name: ar-LY-ImanNeural Gender: Female Name: ar-LY-OmarNeural Gender: Male Name: ar-MA-JamalNeural Gender: Male Name: ar-MA-MounaNeural Gender: Female Name: ar-OM-AbdullahNeural Gender: Male Name: ar-OM-AyshaNeural Gender: Female Name: ar-QA-AmalNeural Gender: Female Name: ar-QA-MoazNeural Gender: Male Name: ar-SA-HamedNeural Gender: Male Name: ar-SA-ZariyahNeural Gender: Female Name: ar-SY-AmanyNeural Gender: Female Name: ar-SY-LaithNeural Gender: Male Name: ar-TN-HediNeural Gender: Male Name: ar-TN-ReemNeural Gender: Female Name: ar-YE-MaryamNeural Gender: Female Name: ar-YE-SalehNeural Gender: Male Name: az-AZ-BabekNeural Gender: Male Name: 
az-AZ-BanuNeural Gender: Female Name: bg-BG-BorislavNeural Gender: Male Name: bg-BG-KalinaNeural Gender: Female Name: bn-BD-NabanitaNeural Gender: Female Name: bn-BD-PradeepNeural Gender: Male Name: bn-IN-BashkarNeural Gender: Male Name: bn-IN-TanishaaNeural Gender: Female Name: bs-BA-GoranNeural Gender: Male Name: bs-BA-VesnaNeural Gender: Female Name: ca-ES-EnricNeural Gender: Male Name: ca-ES-JoanaNeural Gender: Female Name: cs-CZ-AntoninNeural Gender: Male Name: cs-CZ-VlastaNeural Gender: Female Name: cy-GB-AledNeural Gender: Male Name: cy-GB-NiaNeural Gender: Female Name: da-DK-ChristelNeural Gender: Female Name: da-DK-JeppeNeural Gender: Male Name: de-AT-IngridNeural Gender: Female Name: de-AT-JonasNeural Gender: Male Name: de-CH-JanNeural Gender: Male Name: de-CH-LeniNeural Gender: Female Name: de-DE-AmalaNeural Gender: Female Name: de-DE-ConradNeural Gender: Male Name: de-DE-KatjaNeural Gender: Female Name: de-DE-KillianNeural Gender: Male Name: el-GR-AthinaNeural Gender: Female Name: el-GR-NestorasNeural Gender: Male Name: en-AU-NatashaNeural Gender: Female Name: en-AU-WilliamNeural Gender: Male Name: en-CA-ClaraNeural Gender: Female Name: en-CA-LiamNeural Gender: Male Name: en-GB-LibbyNeural Gender: Female Name: en-GB-MaisieNeural Gender: Female Name: en-GB-RyanNeural Gender: Male Name: en-GB-SoniaNeural Gender: Female Name: en-GB-ThomasNeural Gender: Male Name: en-HK-SamNeural Gender: Male Name: en-HK-YanNeural Gender: Female Name: en-IE-ConnorNeural Gender: Male Name: en-IE-EmilyNeural Gender: Female Name: en-IN-NeerjaExpressiveNeural Gender: Female Name: en-IN-NeerjaNeural Gender: Female Name: en-IN-PrabhatNeural Gender: Male Name: en-KE-AsiliaNeural Gender: Female Name: en-KE-ChilembaNeural Gender: Male Name: en-NG-AbeoNeural Gender: Male Name: en-NG-EzinneNeural Gender: Female Name: en-NZ-MitchellNeural Gender: Male Name: en-NZ-MollyNeural Gender: Female Name: en-PH-JamesNeural Gender: Male Name: en-PH-RosaNeural Gender: Female Name: en-SG-LunaNeural 
Gender: Female Name: en-SG-WayneNeural Gender: Male Name: en-TZ-ElimuNeural Gender: Male Name: en-TZ-ImaniNeural Gender: Female Name: en-US-AnaNeural Gender: Female Name: en-US-AriaNeural Gender: Female Name: en-US-ChristopherNeural Gender: Male Name: en-US-EricNeural Gender: Male Name: en-US-GuyNeural Gender: Male Name: en-US-JennyNeural Gender: Female Name: en-US-MichelleNeural Gender: Female Name: en-US-RogerNeural Gender: Male Name: en-US-SteffanNeural Gender: Male Name: en-ZA-LeahNeural Gender: Female Name: en-ZA-LukeNeural Gender: Male Name: es-AR-ElenaNeural Gender: Female Name: es-AR-TomasNeural Gender: Male Name: es-BO-MarceloNeural Gender: Male Name: es-BO-SofiaNeural Gender: Female Name: es-CL-CatalinaNeural Gender: Female Name: es-CL-LorenzoNeural Gender: Male Name: es-CO-GonzaloNeural Gender: Male Name: es-CO-SalomeNeural Gender: Female Name: es-CR-JuanNeural Gender: Male Name: es-CR-MariaNeural Gender: Female Name: es-CU-BelkysNeural Gender: Female Name: es-CU-ManuelNeural Gender: Male Name: es-DO-EmilioNeural Gender: Male Name: es-DO-RamonaNeural Gender: Female Name: es-EC-AndreaNeural Gender: Female Name: es-EC-LuisNeural Gender: Male Name: es-ES-AlvaroNeural Gender: Male Name: es-ES-ElviraNeural Gender: Female Name: es-ES-ManuelEsCUNeural Gender: Male Name: es-GQ-JavierNeural Gender: Male Name: es-GQ-TeresaNeural Gender: Female Name: es-GT-AndresNeural Gender: Male Name: es-GT-MartaNeural Gender: Female Name: es-HN-CarlosNeural Gender: Male Name: es-HN-KarlaNeural Gender: Female Name: es-MX-DaliaNeural Gender: Female Name: es-MX-JorgeNeural Gender: Male Name: es-MX-LorenzoEsCLNeural Gender: Male Name: es-NI-FedericoNeural Gender: Male Name: es-NI-YolandaNeural Gender: Female Name: es-PA-MargaritaNeural Gender: Female Name: es-PA-RobertoNeural Gender: Male Name: es-PE-AlexNeural Gender: Male Name: es-PE-CamilaNeural Gender: Female Name: es-PR-KarinaNeural Gender: Female Name: es-PR-VictorNeural Gender: Male Name: es-PY-MarioNeural Gender: Male Name: 
es-PY-TaniaNeural Gender: Female Name: es-SV-LorenaNeural Gender: Female Name: es-SV-RodrigoNeural Gender: Male Name: es-US-AlonsoNeural Gender: Male Name: es-US-PalomaNeural Gender: Female Name: es-UY-MateoNeural Gender: Male Name: es-UY-ValentinaNeural Gender: Female Name: es-VE-PaolaNeural Gender: Female Name: es-VE-SebastianNeural Gender: Male Name: et-EE-AnuNeural Gender: Female Name: et-EE-KertNeural Gender: Male Name: fa-IR-DilaraNeural Gender: Female Name: fa-IR-FaridNeural Gender: Male Name: fi-FI-HarriNeural Gender: Male Name: fi-FI-NooraNeural Gender: Female Name: fil-PH-AngeloNeural Gender: Male Name: fil-PH-BlessicaNeural Gender: Female Name: fr-BE-CharlineNeural Gender: Female Name: fr-BE-GerardNeural Gender: Male Name: fr-CA-AntoineNeural Gender: Male Name: fr-CA-JeanNeural Gender: Male Name: fr-CA-SylvieNeural Gender: Female Name: fr-CH-ArianeNeural Gender: Female Name: fr-CH-FabriceNeural Gender: Male Name: fr-FR-DeniseNeural Gender: Female Name: fr-FR-EloiseNeural Gender: Female Name: fr-FR-HenriNeural Gender: Male Name: ga-IE-ColmNeural Gender: Male Name: ga-IE-OrlaNeural Gender: Female Name: gl-ES-RoiNeural Gender: Male Name: gl-ES-SabelaNeural Gender: Female Name: gu-IN-DhwaniNeural Gender: Female Name: gu-IN-NiranjanNeural Gender: Male Name: he-IL-AvriNeural Gender: Male Name: he-IL-HilaNeural Gender: Female Name: hi-IN-MadhurNeural Gender: Male Name: hi-IN-SwaraNeural Gender: Female Name: hr-HR-GabrijelaNeural Gender: Female Name: hr-HR-SreckoNeural Gender: Male Name: hu-HU-NoemiNeural Gender: Female Name: hu-HU-TamasNeural Gender: Male Name: id-ID-ArdiNeural Gender: Male Name: id-ID-GadisNeural Gender: Female Name: is-IS-GudrunNeural Gender: Female Name: is-IS-GunnarNeural Gender: Male Name: it-IT-DiegoNeural Gender: Male Name: it-IT-ElsaNeural Gender: Female Name: it-IT-IsabellaNeural Gender: Female Name: ja-JP-KeitaNeural Gender: Male Name: ja-JP-NanamiNeural Gender: Female Name: jv-ID-DimasNeural Gender: Male Name: jv-ID-SitiNeural 
Gender: Female Name: ka-GE-EkaNeural Gender: Female Name: ka-GE-GiorgiNeural Gender: Male Name: kk-KZ-AigulNeural Gender: Female Name: kk-KZ-DauletNeural Gender: Male Name: km-KH-PisethNeural Gender: Male Name: km-KH-SreymomNeural Gender: Female Name: kn-IN-GaganNeural Gender: Male Name: kn-IN-SapnaNeural Gender: Female Name: ko-KR-InJoonNeural Gender: Male Name: ko-KR-SunHiNeural Gender: Female Name: lo-LA-ChanthavongNeural Gender: Male Name: lo-LA-KeomanyNeural Gender: Female Name: lt-LT-LeonasNeural Gender: Male Name: lt-LT-OnaNeural Gender: Female Name: lv-LV-EveritaNeural Gender: Female Name: lv-LV-NilsNeural Gender: Male Name: mk-MK-AleksandarNeural Gender: Male Name: mk-MK-MarijaNeural Gender: Female Name: ml-IN-MidhunNeural Gender: Male Name: ml-IN-SobhanaNeural Gender: Female Name: mn-MN-BataaNeural Gender: Male Name: mn-MN-YesuiNeural Gender: Female Name: mr-IN-AarohiNeural Gender: Female Name: mr-IN-ManoharNeural Gender: Male Name: ms-MY-OsmanNeural Gender: Male Name: ms-MY-YasminNeural Gender: Female Name: mt-MT-GraceNeural Gender: Female Name: mt-MT-JosephNeural Gender: Male Name: my-MM-NilarNeural Gender: Female Name: my-MM-ThihaNeural Gender: Male Name: nb-NO-FinnNeural Gender: Male Name: nb-NO-PernilleNeural Gender: Female Name: ne-NP-HemkalaNeural Gender: Female Name: ne-NP-SagarNeural Gender: Male Name: nl-BE-ArnaudNeural Gender: Male Name: nl-BE-DenaNeural Gender: Female Name: nl-NL-ColetteNeural Gender: Female Name: nl-NL-FennaNeural Gender: Female Name: nl-NL-MaartenNeural Gender: Male Name: pl-PL-MarekNeural Gender: Male Name: pl-PL-ZofiaNeural Gender: Female Name: ps-AF-GulNawazNeural Gender: Male Name: ps-AF-LatifaNeural Gender: Female Name: pt-BR-AntonioNeural Gender: Male Name: pt-BR-FranciscaNeural Gender: Female Name: pt-PT-DuarteNeural Gender: Male Name: pt-PT-RaquelNeural Gender: Female Name: ro-RO-AlinaNeural Gender: Female Name: ro-RO-EmilNeural Gender: Male Name: ru-RU-DmitryNeural Gender: Male Name: ru-RU-SvetlanaNeural Gender: 
Female Name: si-LK-SameeraNeural Gender: Male Name: si-LK-ThiliniNeural Gender: Female Name: sk-SK-LukasNeural Gender: Male Name: sk-SK-ViktoriaNeural Gender: Female Name: sl-SI-PetraNeural Gender: Female Name: sl-SI-RokNeural Gender: Male Name: so-SO-MuuseNeural Gender: Male Name: so-SO-UbaxNeural Gender: Female Name: sq-AL-AnilaNeural Gender: Female Name: sq-AL-IlirNeural Gender: Male Name: sr-RS-NicholasNeural Gender: Male Name: sr-RS-SophieNeural Gender: Female Name: su-ID-JajangNeural Gender: Male Name: su-ID-TutiNeural Gender: Female Name: sv-SE-MattiasNeural Gender: Male Name: sv-SE-SofieNeural Gender: Female Name: sw-KE-RafikiNeural Gender: Male Name: sw-KE-ZuriNeural Gender: Female Name: sw-TZ-DaudiNeural Gender: Male Name: sw-TZ-RehemaNeural Gender: Female Name: ta-IN-PallaviNeural Gender: Female Name: ta-IN-ValluvarNeural Gender: Male Name: ta-LK-KumarNeural Gender: Male Name: ta-LK-SaranyaNeural Gender: Female Name: ta-MY-KaniNeural Gender: Female Name: ta-MY-SuryaNeural Gender: Male Name: ta-SG-AnbuNeural Gender: Male Name: ta-SG-VenbaNeural Gender: Female Name: te-IN-MohanNeural Gender: Male Name: te-IN-ShrutiNeural Gender: Female Name: th-TH-NiwatNeural Gender: Male Name: th-TH-PremwadeeNeural Gender: Female Name: tr-TR-AhmetNeural Gender: Male Name: tr-TR-EmelNeural Gender: Female Name: uk-UA-OstapNeural Gender: Male Name: uk-UA-PolinaNeural Gender: Female Name: ur-IN-GulNeural Gender: Female Name: ur-IN-SalmanNeural Gender: Male Name: ur-PK-AsadNeural Gender: Male Name: ur-PK-UzmaNeural Gender: Female Name: uz-UZ-MadinaNeural Gender: Female Name: uz-UZ-SardorNeural Gender: Male Name: vi-VN-HoaiMyNeural Gender: Female Name: vi-VN-NamMinhNeural Gender: Male Name: zh-CN-XiaoxiaoNeural Gender: Female Name: zh-CN-XiaoyiNeural Gender: Female Name: zh-CN-YunjianNeural Gender: Male Name: zh-CN-YunxiNeural Gender: Male Name: zh-CN-YunxiaNeural Gender: Male Name: zh-CN-YunyangNeural Gender: Male Name: zh-CN-liaoning-XiaobeiNeural Gender: Female Name: 
zh-CN-shaanxi-XiaoniNeural Gender: Female Name: zh-HK-HiuGaaiNeural Gender: Female Name: zh-HK-HiuMaanNeural Gender: Female Name: zh-HK-WanLungNeural Gender: Male Name: zh-TW-HsiaoChenNeural Gender: Female Name: zh-TW-HsiaoYuNeural Gender: Female Name: zh-TW-YunJheNeural Gender: Male Name: zu-ZA-ThandoNeural Gender: Female Name: zu-ZA-ThembaNeural Gender: Male一望而知,几乎支持所有主流的通用语,Gender字段为合成语音的性别,Male代表男性,Female代表女性,zh开头的就是中文语音角色,这里以微软的小伊为例子:edge-tts --voice zh-CN-XiaoyiNeural --text "你好啊,我是智能语音助手" --write-media hello_in_cn.mp3该命令含义是通过zh-CN-XiaoyiNeural角色合成语音:"你好啊,我是智能语音助手"的内容,随后将音频流写入hello\_in\_cn.mp3文件。 程序返回:Downloads edge-tts --voice zh-CN-XiaoyiNeural --text "你好啊,我是智能语音助手" --write-media hello_in_cn.mp3 WEBVTT 00:00:00.100 --> 00:00:00.525 00:00:00.525 --> 00:00:00.912 00:00:01.050 --> 00:00:01.238 00:00:01.238 --> 00:00:01.375 00:00:01.387 --> 00:00:01.700 00:00:01.700 --> 00:00:02.050 00:00:02.062 --> 00:00:02.550 助手程序会自动将时间轴和语音文本匹配输出,如此一来,连字幕文件也有了,可谓是一举两得,一箭双雕。 与此同时,我们也可以调整合成语音的语速:edge-tts --rate=-50% --voice zh-CN-XiaoyiNeural --text "你好啊,我是智能语音助手" --write-media hello_in_cn.mp3--rate参数可以通过加号或者减号同步加快或者减慢合成语音的语速。 亦或者,调整合成语音的音量:edge-tts --volume=-50% --voice zh-CN-XiaoyiNeural --text "你好啊,我是智能语音助手" --write-media hello_in_cn.mp3--volume参数可以调整语音的音量。 遗憾的是,和微软Azure官方的语音合成库相比,开源的语音合成库并不支持基于标记语言 (SSML)的语音调优,比如语调、情绪的调整,但这毕竟是免费的,要求也不能太高了。Python脚本语音合成除了通过命令进行语音合成,edge-tts也支持在Python脚本,编辑test.py文件:import asyncio import edge_tts TEXT = "你好哟,我是智能语音助手,小伊" VOICE = "zh-CN-XiaoyiNeural" OUTPUT_FILE = "/Users/liuyue/Downloads/test.mp3" async def _main() -> None: communicate = edge_tts.Communicate(TEXT, VOICE) await communicate.save(OUTPUT_FILE) if __name__ == "__main__": asyncio.run(_main())这里我们直接通过异步模式调用communicate实例的save方法,就可以并发异步生成语音合成的音频文件,非常方便。 也可以通过语音管理库来自动寻找我们需要的语言:import asyncio import random import edge_tts from edge_tts import VoicesManager TEXT = "中文语音测试" OUTPUT_FILE ="china.mp3" async def _main() -> None: voices = await VoicesManager.create() voice = 
voices.find(Gender="Female", Language="zh") communicate = edge_tts.Communicate(TEXT, random.choice(voice)["Name"]) await communicate.save(OUTPUT_FILE) if __name__ == "__main__": asyncio.run(_main())

这里通过内置的VoicesManager库来随机挑选中文语音角色完成语音合成操作。 除此之外,也可以通过脚本将语音流和字幕同步进行生成:import asyncio import edge_tts TEXT = "这里是语音流测试" VOICE = "zh-CN-XiaoyiNeural" OUTPUT_FILE = "test.mp3" WEBVTT_FILE = "test.vtt" async def _main() -> None: communicate = edge_tts.Communicate(TEXT, VOICE) submaker = edge_tts.SubMaker() with open(OUTPUT_FILE, "wb") as file: async for chunk in communicate.stream(): if chunk["type"] == "audio": file.write(chunk["data"]) elif chunk["type"] == "WordBoundary": submaker.create_sub((chunk["offset"], chunk["duration"]), chunk["text"]) with open(WEBVTT_FILE, "w", encoding="utf-8") as file: file.write(submaker.generate_subs()) if __name__ == "__main__": asyncio.run(_main())这里异步调用之后,音频会写入test.mp3,而字幕文件则会写入test.vtt。后续可以通过ffmpeg将生成的字幕文件叠加到视频中,请参见:基于Python3(Autosub)以及Ffmpeg配合GoogleTranslation(谷歌翻译)为你的影片实现双语版字幕(逐字稿)

结语

开源语音合成edge-tts库可以提高语音合成效率,并且极大地降低了语音合成门槛,为自动化视频剪辑铺平了道路,未来结合基于Stable-Diffusion算法的AI绘图框架,人工智能AI一键式绘制、配音、上字幕的一条龙服务指日可待。
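上文WEBVTT输出中的时间戳(如00:00:00.100),来自WordBoundary事件返回的offset/duration字段,其单位是100纳秒(即1秒 = 10,000,000)。换算成WEBVTT时间戳的逻辑可以用一个小函数示意(演示性质的假设代码,并非SubMaker源码):

```python
def to_vtt_timestamp(offset: int) -> str:
    # offset以100纳秒为单位:1秒 = 10_000_000 个单位
    seconds, rem = divmod(offset, 10_000_000)
    milliseconds = rem // 10_000
    minutes, seconds = divmod(seconds, 60)
    hours, minutes = divmod(minutes, 60)
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}.{milliseconds:03d}"
```

例如offset为1_000_000(0.1秒)时,输出即为上文日志中出现的"00:00:00.100"。
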

人工智能,丹青圣手,全平台(原生/Docker)构建Stable-Diffusion-Webui的AI绘画库教程(Python3.10/Pytorch1.13.0)

世间无限丹青手,遇上AI画不成。最近一段时间,可能所有人类画师都得发出一句"既生瑜,何生亮"的感叹,因为AI绘画通用算法Stable Diffusion已然超神,无需美术基础,也不用经年累月的刻苦练习,只需要一台电脑,人人都可以是丹青圣手。本次我们全平台构建基于Stable-Diffusion算法的Webui可视化图形界面服务,基于本地模型来进行AI绘画操作。

本地安装Stable-Diffusion-Webui

如果系统之前安装过Python3.10或者使用过Pytorch深度学习框架,那么推荐直接本地安装Stable-Diffusion-Webui,因为Stable-Diffusion的核心依赖库也是Pytorch。 首先拉取官方的项目:git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui.git随后进入项目的目录:cd stable-diffusion-webui官方文档建议直接在目录中运行shell脚本:./webui.sh但事实上,shell脚本很容易在过程中报错,该项目的核心代码其实是launch.py,所以理论上,我们只需要正常运行launch.py文件即可。 首先确保本机的Python版本号大于等于3.10.9,关于Python3.10的安装,请移玉步至:一网成擒全端涵盖,在不同架构(Intel x86/Apple m1 silicon)不同开发平台(Win10/Win11/Mac/Ubuntu)上安装配置Python3.10开发环境,这里不再赘述。另外确保Pytorch的版本号大于等于1.13.0,关于Pytorch,请移步:闻其声而知雅意,M1 Mac基于PyTorch(mps/cpu/cuda)的人工智能AI本地语音识别库Whisper(Python3.10) 随后安装相关的依赖库:pip3 install -r requirements.txt pip3 install -r requirements_versions.txt依赖文件中,有一个库可能会出问题,就是GFPGAN,它是腾讯开源的人脸修复模块,这里推荐使用GFPGAN官方网站(https://github.com/TencentARC/GFPGAN)的安装方式:# Install basicsr - https://github.com/xinntao/BasicSR # We use BasicSR for both training and inference pip install basicsr # Install facexlib - https://github.com/xinntao/facexlib # We use face detection and face restoration helper in the facexlib package pip install facexlib pip install -r requirements.txt python setup.py develop # If you want to enhance the background (non-face) regions with Real-ESRGAN, # you also need to install the realesrgan package pip install realesrgan安装成功后,最好验证一下:➜ ~ python3 Python 3.10.9 (main, Dec 15 2022, 17:11:09) [Clang 14.0.0 (clang-1400.0.29.202)] on darwin Type "help", "copyright", "credits" or "license" for more information.
>>> import gfpgan >>>所有依赖安装成功后,就可以直接运行launch.py文件即可:python3 launch.py程序返回:Python 3.10.9 (main, Dec 15 2022, 17:11:09) [Clang 14.0.0 (clang-1400.0.29.202)] Commit hash: 0cc0ee1bcb4c24a8c9715f66cede06601bfc00c8 Installing requirements for Web UI Launching Web UI with arguments: --upcast-sampling --use-cpu interrogate Warning: caught exception 'Torch not compiled with CUDA enabled', memory monitor disabled No module 'xformers'. Proceeding without it. ============================================================================== You are running torch 1.13.0. The program is tested to work with torch 1.13.1. To reinstall the desired version, run with commandline flag --reinstall-torch. Beware that this will cause a lot of large files to be downloaded, as well as there are reports of issues with training tab on the latest version. Use --skip-version-check commandline argument to disable this check. ============================================================================== Loading weights [6ce0161689] from /Users/liuyue/wodfan/work/stable-diffusion-webui/models/Stable-diffusion/v1-5-pruned-emaonly.safetensors Creating model from config: /Users/liuyue/wodfan/work/stable-diffusion-webui/configs/v1-inference.yaml LatentDiffusion: Running in eps-prediction mode DiffusionWrapper has 859.52 M params. Applying cross attention optimization (InvokeAI). Textual inversion embeddings loaded(0): Model loaded in 8.2s (create model: 0.6s, apply weights to model: 5.0s, apply half(): 1.9s, move model to device: 0.5s). 
Running on local URL: http://127.0.0.1:7860Stable-Diffusion-Webui服务会运行在系统的7860端口上。 需要注意的是,如果本地系统不支持cuda模式,需要修改运行命令:python3 launch.py --skip-torch-cuda-test --upcast-sampling --use-cpu interrogate这里使用CPU来进行模型训练。 另外如果是M系列的Mac,其实是支持MPS模式的,但Stable Diffusion目前的最新版并不支持MPS,所以需要单独设置环境变量,关闭MPS模式:export PYTORCH_ENABLE_MPS_FALLBACK=1最后访问http://127.0.0.1:7860即可,本地构建Stable-Diffusion-Webui服务就完成了。Docker构建Stable-Diffusion-Webui如果不想太折腾,也可以使用Docker容器来构建Stable-Diffusion-Webui,同样地,需要拉取线上的Docker配置文件项目:git clone https://github.com/AbdBarho/stable-diffusion-webui-docker.git随后进入项目的目录:stable-diffusion-webui-docker接着运行命令下载相关的依赖镜像:docker compose --profile download up --build下载完成后,运行命令构建容器:docker compose --profile auto up --build这里需要注意的是,模型数据和输出文件夹会以/data和/output的形式挂载到容器中,如果想在宿主机往容器内传入模型或者其他图片,需要写入项目中的data目录。 过程中,可能会报错:Found no NVIDIA driver on your system这是因为容器内找不到NVIDIA的显卡驱动。这里需要单独再启动一个容器服务:docker run -ti --runtime=nvidia -e NVIDIA_DRIVER_CAPABILITIES=compute,utility -e NVIDIA_VISIBLE_DEVICES=all allennlp/allennlp总的来说,安装过程简单,但是调试比较费劲,一旦启动出问题,就得进入容器内部修改代码,或者反复修改Dockerfile文件,所以Docker比较适合封装业务改动频繁的容器,而不是依赖环境繁多并且版本需要反复调整的场景。Stable-Diffusion-Webui图像绘制配置好Stable-Diffusion-Webui环境之后,访问http://127.0.0.1:7860: 在Prompt文本框中填入引导词:Tall buildings, people bustling, heavy traffic, staggered light and shadow, the beauty of the city is conspicuous before.随后点击右侧Generate生成按钮即可,这里引导词的意思是:高楼林立,人群熙熙攘攘,车水马龙,光影交错,城市之美尽显眼前。 注意引导词需要使用逗号分隔。 后端开始进行训练:To create a public link, set `share=True` in `launch()`. 
100%|██████████████████████████████| 20/20 [00:24<00:00, 1.25s/it] Total progress: 100%|██████████████████████████████| 20/20 [00:19<00:00, 1.00it/s] ...(后续为多轮几乎相同的20/20训练进度输出,此处从略)

片刻之间,挥毫落纸如云烟。 遗憾的是,引导词不支持中文,但可以配置权重,数值从0.1~100,默认状态是1,低于1就是减弱,大于1就是加强: (Tall buildings:1.1), (people bustling:1.61), (heavy traffic:0.3), (staggered light and shadow:1.3)

Stable-Diffusion-Webui也支持Negative prompt(反向引导词),就是用文字描述你不想在图像中出现的东西:系统对图片去噪时,会使其看起来更像你的提示词,同时远离你的反向提示词,并观察正反两个方向之间的差异,利用它产生一组对噪声图片的改变,将最终结果移向前者而远离后者。默认通用反向引导词:lowres,bad anatomy,bad hands,text,error,missing fingers, extra digit,fewer digits,cropped,worst quality, low quality,normal quality,jpeg artifacts,signature, watermark,username,blurry,missing arms,long neck, Humpbacked,missing limb,too many fingers, mutated,poorly drawn,out of frame,bad hands, owres,unclear eyes,poorly drawn,cloned face,bad face

除了引导词,还可以调整采样迭代步数(Sampling Steps)。系统先随机生成一个基础的图片,然后一步步地调整图片,向引导词Prompt靠拢,Sampling Steps参数就是告诉人工智能,这样的步骤应该进行多少次。次数越多,每一步训练也就越小越精确,当然成本也会越高,同时每一次训练的时间也会同比增长。

除了迭代步数,也可以自由地选择采样方法(Sampling method),也就是让Stable-Diffusion-Webui具体使用什么算法来训练图片模型。默认算法是Euler a:富有创造力,不同步数可以生产出不同的图片,但是超过30步左右基本就没有实质性的增益效果。Euler算法:最简单的算法,训练速度也是最快的。LMS算法:Euler的延伸算法,相对更稳定一点,30步就比较稳定了。PLMS:优化过的LMS算法。

其他的一些参数: 生成批次Batch count/n\_iter:同样的配置,循环跑几次。每批数量 Batch size:同时生成多少个图像,增加这个值可以并行运行,但也需要更多的显卡显存。提示词相关性 CFG Scale:图像与引导词匹配程度。增加这个值将导致图像更接近引导词,但过高会让图像色彩过于饱和。一般在5~15之间为好,7,9,12是3个常见的设置值。宽度 X 高度 Width X
Height:单位是像素,适当增加尺寸,后台会试图填充更多的细节进来。Stable-Diffusion-Webui定制化模型Stable-Diffusion-Webui默认下载的基础模型在项目目录的models/Stable-diffusion文件夹中:/stable-diffusion-webui/models/Stable-diffusion模型名称是v1-5-pruned-emaonly.safetensors,体积是4.27GB。 如果需要一些比较有个性定制化模型,可以在civitai.com平台进行挑选和下载,需要注意的是,该平台上的模型鱼龙混杂,良莠不齐,不能说是蔚为大观,但多多少少有点泥沙俱下的意思,所以最好不要在比较正式的公共(工作)环境打开该平台,否则结果可能会令人非常尴尬。 这里我们选择相对比较潮流的赛博朋克风格模型:synthwavepunk 将下载的模型放入models/Stable-diffusion目录。 随后重启Stable-Diffusion-Webui服务:python3 launch.py --skip-torch-cuda-test --upcast-sampling --use-cpu interrogate在页面表单中的Stable Diffusion checkpoint选项里选择对应的模型: 引导词:concept art, 4k, intricate, pinup, a woman, beautiful face, embroidery, lace, hyper-realistic, highly detailed, octane render, concept art, smooth, 8k, dancing princess, snthwve style, nvinkpunk, by jeremy mann, by sandra chevrier, by dave mckean and richard avedon and maciej kuciara训练结果:好了,现在,你已经知晓那些网络上的漂亮小姐姐是怎么生成的了。结语也许我们只是偶尔被网络上那些酷炫而猎奇的AI生成图所吸引,但如果你真的动手去搭建、调试、甚至开始训练属于自己的绘画模型,相信我,你马上就会深陷其中,不能自拔,AI仿若可以满足你所有的幻想,欲望满溢,又欲言又止,分寸把握之精确,妙入毫颠。什么?你还在玩那些无聊的电子游戏?相信我,Stable-Diffusion-Webui才是最高级的精神享受,没有之一,也不可替代。
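上文"(Tall buildings:1.1)"这类权重语法的解析思路,可以用一个简化的Python函数示意(仅为演示概念,函数名为假设,并非webui的真实解析器——webui还支持嵌套括号、方括号减权等更复杂的语法):

```python
import re

# 匹配 "(引导词:权重)" 形式的片段
_WEIGHTED = re.compile(r"\(([^:()]+):([0-9.]+)\)")

def parse_prompt(prompt: str):
    # 返回 [(引导词, 权重)] 列表,未标注权重的词默认取1.0
    terms = [(m.group(1).strip(), float(m.group(2))) for m in _WEIGHTED.finditer(prompt)]
    # 去掉加权片段后,剩余逗号分隔的词按默认权重处理
    for token in _WEIGHTED.sub("", prompt).split(","):
        token = token.strip()
        if token:
            terms.append((token, 1.0))
    return terms
```

解析出的权重随后会用来缩放对应引导词在交叉注意力中的影响力,权重大于1即加强,小于1即减弱。
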

登峰造极,师出造化,Pytorch人工智能AI图像增强框架ControlNet绘画实践,基于Python3.10

人工智能太疯狂,传统劳动力和内容创作平台被AI枪毙,弃尸尘埃。并非空穴来风,也不是危言耸听,人工智能AI图像增强框架ControlNet正在疯狂地改写绘画艺术的发展进程,你问我绘画行业未来的样子?我只好指着ControlNet的方向。本次我们在M1/M2芯片的Mac系统下,体验人工智能登峰造极的绘画艺术。

本地安装和配置ControlNet

ControlNet在HuggingFace训练平台上也有体验版,请参见: https://huggingface.co/spaces/hysts/ControlNet,但由于公共平台算力有限,同时输入参数也受到平台的限制,一次只能训练一张图片,不能让人开怀畅饮。为了能和史上最伟大的图像增强框架ControlNet一亲芳泽,我们选择本地搭建ControlNet环境,首先运行Git命令拉取官方的线上代码:git clone https://github.com/lllyasviel/ControlNet.git拉取成功后,进入项目目录:cd ControlNet由于Github对文件大小有限制,所以ControlNet的训练模型只能单独下载,模型都放在HuggingFace平台上:https://huggingface.co/lllyasviel/ControlNet/tree/main/models,需要注意的是,每个模型的体积都非常巨大,达到了5.71G,令人咋舌。 下载好模型后,需要将其放到ControlNet的models目录中:├── models │ ├── cldm_v15.yaml │ ├── cldm_v21.yaml │ └── control_sd15_canny.pth这里笔者下载了control\_sd15\_canny.pth模型,即放入models目录中,其他模型也是一样。 随后安装运行环境,官方推荐使用conda虚拟环境,安装好conda后,运行命令激活虚拟环境即可:conda env create -f environment.yaml conda activate control但笔者查看了官方的environment.yaml配置文件:name: control channels: - pytorch - defaults dependencies: - python=3.8.5 - pip=20.3 - cudatoolkit=11.3 - pytorch=1.12.1 - torchvision=0.13.1 - numpy=1.23.1 - pip: - gradio==3.16.2 - albumentations==1.3.0 - opencv-contrib-python==4.3.0.36 - imageio==2.9.0 - imageio-ffmpeg==0.4.2 - pytorch-lightning==1.5.0 - omegaconf==2.1.1 - test-tube>=0.7.5 - streamlit==1.12.1 - einops==0.3.0 - transformers==4.19.2 - webdataset==0.2.5 - kornia==0.6 - open_clip_torch==2.0.2 - invisible-watermark>=0.1.5 - streamlit-drawable-canvas==0.8.0 - torchmetrics==0.6.0 - timm==0.6.12 - addict==2.4.0 - yapf==0.32.0 - prettytable==3.6.0 - safetensors==0.2.7 - basicsr==1.4.2一望而知,Python版本是老旧的3.8,Torch版本1.12对Mac独有的Mps训练模式支持尚不完善。 同时,Conda环境也有一些缺点: 环境隔离可能会导致一些问题。虽然虚拟环境允许您管理软件包的版本和依赖关系,但有时也可能导致环境冲突和奇怪的错误。Conda环境可以占用大量磁盘空间。每个环境都需要独立的软件包副本和依赖项。如果需要创建多个环境,这可能会导致磁盘空间不足的问题。软件包可用性和兼容性也可能是一个问题。Conda环境可能不包含某些软件包或库,或者可能不支持特定操作系统或硬件架构。在某些情况下,Conda环境的创建和管理可能会变得复杂和耗时。如果需要管理多个环境,并且需要在这些环境之间频繁切换,这可能会变得困难。所以我们也可以用最新版的Python3.10来构建ControlNet训练环境,编写requirements.txt文件(注意pip中PyTorch的包名是torch,而非pytorch):torch==1.13.0 gradio==3.16.2 albumentations==1.3.0
opencv-contrib-python==4.3.0.36 imageio==2.9.0 imageio-ffmpeg==0.4.2 pytorch-lightning==1.5.0 omegaconf==2.1.1 test-tube>=0.7.5 streamlit==1.12.1 einops==0.3.0 transformers==4.19.2 webdataset==0.2.5 kornia==0.6 open_clip_torch==2.0.2 invisible-watermark>=0.1.5 streamlit-drawable-canvas==0.8.0 torchmetrics==0.6.0 timm==0.6.12 addict==2.4.0 yapf==0.32.0 prettytable==3.6.0 safetensors==0.2.7 basicsr==1.4.2

随后,运行命令:

pip3 install -r requirements.txt

至此,基于Python3.10来构建ControlNet训练环境就完成了,关于Python3.10的安装,请移玉步至:一网成擒全端涵盖,在不同架构(Intel x86/Apple m1 silicon)不同开发平台(Win10/Win11/Mac/Ubuntu)上安装配置Python3.10开发环境,这里不再赘述。

修改训练模式(Cuda/Cpu/Mps)

ControlNet的代码中将训练模式写死为Cuda,CUDA是NVIDIA开发的一个并行计算平台和编程模型,因此不支持NVIDIA GPU的系统将无法运行CUDA训练模式。除此之外,其他不支持CUDA训练模式的系统可能包括:

没有安装NVIDIA GPU驱动程序的系统
没有安装CUDA工具包的系统
使用的NVIDIA GPU不支持CUDA(较旧的GPU型号可能不支持CUDA)
没有足够的GPU显存来运行CUDA训练模式(尤其是在训练大型深度神经网络时需要大量显存)

需要注意的是,即使系统支持CUDA,也需要确保所使用的机器学习框架支持CUDA,否则无法使用CUDA进行训练。我们可以修改代码将训练模式改为Mac支持的Mps,请参见:闻其声而知雅意,M1 Mac基于PyTorch(mps/cpu/cuda)的人工智能AI本地语音识别库Whisper(Python3.10),这里不再赘述。

如果代码运行过程中,报下面的错误:

RuntimeError: Attempting to deserialize object on a CUDA device but torch.cuda.is_available() is False. If you are running on a CPU-only machine, please use torch.load with map_location=torch.device('cpu') to map your storages to the CPU.

说明当前系统不支持cuda模式,需要修改几个地方,以项目中的gradio_canny2image.py为例子,需要将gradio_canny2image.py文件中的cuda替换为cpu,同时修改/ControlNet/ldm/modules/encoders/modules.py文件,将cuda替换为cpu,修改/ControlNet/cldm/ddim_hacked.py文件,将cuda替换为cpu。至此,训练模式就改成cpu了。

开始训练

修改完代码后,直接在终端运行gradio_canny2image.py文件:

python3 gradio_canny2image.py

程序返回:

➜ ControlNet git:(main) ✗ /opt/homebrew/bin/python3.10 "/Users/liuyue/wodfan/work/ControlNet/gradio_canny2image.py"
logging improved.
No module 'xformers'. Proceeding without it.
/opt/homebrew/lib/python3.10/site-packages/pytorch_lightning/utilities/distributed.py:258: LightningDeprecationWarning: `pytorch_lightning.utilities.distributed.rank_zero_only` has been deprecated in v1.8.1 and will be removed in v2.0.0.
You can import it from `pytorch_lightning.utilities` instead. rank_zero_deprecation( ControlLDM: Running in eps-prediction mode DiffusionWrapper has 859.52 M params. making attention of type 'vanilla' with 512 in_channels Working with z of shape (1, 4, 32, 32) = 4096 dimensions. making attention of type 'vanilla' with 512 in_channels Loaded model config from [./models/cldm_v15.yaml] Loaded state_dict from [./models/control_sd15_canny.pth] Running on local URL: http://0.0.0.0:7860 To create a public link, set `share=True` in `launch()`.此时,在本地系统的7860端口上会运行ControlNet的Web客户端服务。 访问 http://localhost:7860,就可以直接上传图片进行训练了。 这里以本站的Logo图片为例子: 通过输入引导词和其他训练参数,就可以对现有图片进行扩散模型的增强处理,这里的引导词的意思是:红宝石、黄金、油画。训练结果可谓是言有尽而意无穷了。 除了主引导词,系统默认会添加一些辅助引导词,比如要求图像品质的best quality, extremely detailed等等,完整代码:from share import * import config import cv2 import einops import gradio as gr import numpy as np import torch import random from pytorch_lightning import seed_everything from annotator.util import resize_image, HWC3 from annotator.canny import CannyDetector from cldm.model import create_model, load_state_dict from cldm.ddim_hacked import DDIMSampler apply_canny = CannyDetector() model = create_model('./models/cldm_v15.yaml').cpu() model.load_state_dict(load_state_dict('./models/control_sd15_canny.pth', location='cpu')) model = model.cpu() ddim_sampler = DDIMSampler(model) def process(input_image, prompt, a_prompt, n_prompt, num_samples, image_resolution, ddim_steps, guess_mode, strength, scale, seed, eta, low_threshold, high_threshold): with torch.no_grad(): img = resize_image(HWC3(input_image), image_resolution) H, W, C = img.shape detected_map = apply_canny(img, low_threshold, high_threshold) detected_map = HWC3(detected_map) control = torch.from_numpy(detected_map.copy()).float().cpu() / 255.0 control = torch.stack([control for _ in range(num_samples)], dim=0) control = einops.rearrange(control, 'b h w c -> b c h w').clone() if seed == -1: seed = random.randint(0, 65535) seed_everything(seed) 
if config.save_memory: model.low_vram_shift(is_diffusing=False) cond = {"c_concat": [control], "c_crossattn": [model.get_learned_conditioning([prompt + ', ' + a_prompt] * num_samples)]} un_cond = {"c_concat": None if guess_mode else [control], "c_crossattn": [model.get_learned_conditioning([n_prompt] * num_samples)]} shape = (4, H // 8, W // 8) if config.save_memory: model.low_vram_shift(is_diffusing=True) model.control_scales = [strength * (0.825 ** float(12 - i)) for i in range(13)] if guess_mode else ([strength] * 13) # Magic number. IDK why. Perhaps because 0.825**12<0.01 but 0.826**12>0.01 samples, intermediates = ddim_sampler.sample(ddim_steps, num_samples, shape, cond, verbose=False, eta=eta, unconditional_guidance_scale=scale, unconditional_conditioning=un_cond) if config.save_memory: model.low_vram_shift(is_diffusing=False) x_samples = model.decode_first_stage(samples) x_samples = (einops.rearrange(x_samples, 'b c h w -> b h w c') * 127.5 + 127.5).cpu().numpy().clip(0, 255).astype(np.uint8) results = [x_samples[i] for i in range(num_samples)] return [255 - detected_map] + results block = gr.Blocks().queue() with block: with gr.Row(): gr.Markdown("## Control Stable Diffusion with Canny Edge Maps") with gr.Row(): with gr.Column(): input_image = gr.Image(source='upload', type="numpy") prompt = gr.Textbox(label="Prompt") run_button = gr.Button(label="Run") with gr.Accordion("Advanced options", open=False): num_samples = gr.Slider(label="Images", minimum=1, maximum=12, value=1, step=1) image_resolution = gr.Slider(label="Image Resolution", minimum=256, maximum=768, value=512, step=64) strength = gr.Slider(label="Control Strength", minimum=0.0, maximum=2.0, value=1.0, step=0.01) guess_mode = gr.Checkbox(label='Guess Mode', value=False) low_threshold = gr.Slider(label="Canny low threshold", minimum=1, maximum=255, value=100, step=1) high_threshold = gr.Slider(label="Canny high threshold", minimum=1, maximum=255, value=200, step=1) ddim_steps = 
gr.Slider(label="Steps", minimum=1, maximum=100, value=20, step=1) scale = gr.Slider(label="Guidance Scale", minimum=0.1, maximum=30.0, value=9.0, step=0.1) seed = gr.Slider(label="Seed", minimum=-1, maximum=2147483647, step=1, randomize=True) eta = gr.Number(label="eta (DDIM)", value=0.0) a_prompt = gr.Textbox(label="Added Prompt", value='best quality, extremely detailed') n_prompt = gr.Textbox(label="Negative Prompt", value='longbody, lowres, bad anatomy, bad hands, missing fingers, extra digit, fewer digits, cropped, worst quality, low quality') with gr.Column(): result_gallery = gr.Gallery(label='Output', show_label=False, elem_id="gallery").style(grid=2, height='auto') ips = [input_image, prompt, a_prompt, n_prompt, num_samples, image_resolution, ddim_steps, guess_mode, strength, scale, seed, eta, low_threshold, high_threshold] run_button.click(fn=process, inputs=ips, outputs=[result_gallery]) block.launch(server_name='0.0.0.0')其他的模型,比如gradio\_hed2image.py,它可以保留输入图像中的许多细节,适合图像的重新着色和样式化的场景:还记得AnimeGANv2模型吗:神工鬼斧惟肖惟妙,M1 mac系统深度学习框架Pytorch的二次元动漫动画风格迁移滤镜AnimeGANv2+Ffmpeg(图片+视频)快速实践,之前还只能通过统一模型滤镜进行转化,现在只要修改引导词,我们就可以肆意地变化出不同的滤镜,人工智能技术的发展,就像发情的海,汹涌澎湃。结语“人类嘛时候会被人工智能替代呀?” “就是现在!就在今天!”就算是达芬奇还魂,齐白石再生,他们也会被现今的人工智能AI技术所震撼,纵横恣肆的笔墨,抑扬变化的形态,左右跌宕的心气,焕然飞动的神采!历史长河中这一刻,大千世界里这一处,让我们变得疯狂! 最后奉上修改后的基于Python3.10的Cpu训练版本的ControlNet,与众亲同飨:https://github.com/zcxey2911/ControlNet\_py3.10\_cpu\_NoConda
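顺着上文"将cuda替换为cpu/mps"的思路,也可以不做硬编码替换,而是封装一个按优先级自动回退的小函数(纯Python示意,不依赖torch;两个布尔参数模拟torch.cuda.is_available()与torch.backends.mps.is_available()的返回值,函数名pick_device为笔者虚构):

```python
def pick_device(cuda_ok: bool, mps_ok: bool) -> str:
    """按 cuda -> mps -> cpu 的优先级选择训练设备字符串。"""
    if cuda_ok:
        return "cuda"
    if mps_ok:
        return "mps"
    return "cpu"


# 在没有N卡的M1/M2 Mac上会回退到mps;两者都不可用时回退到cpu
print(pick_device(False, True), pick_device(False, False))
```

真实代码中只需把写死的"cuda"换成pick_device(torch.cuda.is_available(), torch.backends.mps.is_available())的返回值即可,无须逐文件手工替换。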
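另外,前面process函数里guess mode用到的control_scales衰减序列(那串0.825的"魔法数字"),可以用一段纯Python单独复算一下,直观感受13层控制强度的分布(这里strength=1.0为假设值,仅作演示):

```python
# guess mode 下 ControlNet 13 层的控制强度:越浅的层(i 越小)衰减越多
strength = 1.0
scales = [strength * (0.825 ** float(12 - i)) for i in range(13)]

# 最深层权重等于 strength 本身,向浅层按 0.825 的幂逐级衰减
print(round(scales[0], 4), scales[12])
```

可以看到序列从最深层的strength一路单调衰减到最浅层的约十分之一,也就是guess mode刻意弱化了浅层的控制力度。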

玫瑰花变蚊子血,自动化无头浏览器对比测试,新贵PlayWright Vs 老牌Selenium,基于Python3.10

也许每一个男子全都有过这样的两个女人,至少两个。娶了红玫瑰,久而久之,红的变了墙上的一抹蚊子血,白的还是床前明月光;娶了白玫瑰,白的便是衣服上沾的一粒饭黏子,红的却是心口上一颗朱砂痣。--张爱玲《红玫瑰与白玫瑰》Selenium一直都是Python开源自动化浏览器工具的王者,但这两年微软开源的PlayWright异军突起,后来者居上,隐隐然有撼动Selenium江湖地位之势,本次我们来对比PlayWright与Selenium之间的差异,看看曾经的玫瑰花Selenium是否会变成蚊子血。PlayWright的安装和使用PlayWright是由业界大佬微软(Microsoft)开源的端到端 Web 测试和自动化库,可谓是大厂背书,功能满格,虽然作为无头浏览器,该框架的主要作用是测试 Web 应用,但事实上,无头浏览器更多的是用于 Web 抓取目的,也就是爬虫。首先终端运行安装命令:pip3 install playwright程序返回:Successfully built greenlet Installing collected packages: pyee, greenlet, playwright Attempting uninstall: greenlet Found existing installation: greenlet 2.0.2 Uninstalling greenlet-2.0.2: Successfully uninstalled greenlet-2.0.2 Successfully installed greenlet-2.0.1 playwright-1.30.0 pyee-9.0.4目前最新稳定版为1.30.0 随后可以选择直接安装浏览器驱动:playwright install程序返回:Downloading Chromium 110.0.5481.38 (playwright build v1045) from https://playwright.azureedge.net/builds/chromium/1045/chromium-mac-arm64.zip 123.8 Mb [====================] 100% 0.0s Chromium 110.0.5481.38 (playwright build v1045) downloaded to /Users/liuyue/Library/Caches/ms-playwright/chromium-1045 Downloading FFMPEG playwright build v1008 from https://playwright.azureedge.net/builds/ffmpeg/1008/ffmpeg-mac-arm64.zip 1 Mb [====================] 100% 0.0s FFMPEG playwright build v1008 downloaded to /Users/liuyue/Library/Caches/ms-playwright/ffmpeg-1008 Downloading Firefox 108.0.2 (playwright build v1372) from https://playwright.azureedge.net/builds/firefox/1372/firefox-mac-11-arm64.zip 69.8 Mb [====================] 100% 0.0s Firefox 108.0.2 (playwright build v1372) downloaded to /Users/liuyue/Library/Caches/ms-playwright/firefox-1372 Downloading Webkit 16.4 (playwright build v1767) from https://playwright.azureedge.net/builds/webkit/1767/webkit-mac-12-arm64.zip 56.9 Mb [====================] 100% 0.0s Webkit 16.4 (playwright build v1767) downloaded to /Users/liuyue/Library/Caches/ms-playwright/webkit-1767默认会下载Chromium内核、Firefox以及Webkit驱动。 其中使用最广泛的就是基于Chromium内核的浏览器,最负盛名的就是Google的Chrome和微软自家的Edge。 
确保当前电脑安装了Edge浏览器,让我们小试牛刀一把:

from playwright.sync_api import sync_playwright
import time

with sync_playwright() as p:
    browser = p.chromium.launch(channel="msedge", headless=False)
    page = browser.new_page()
    page.goto('http://v3u.cn')
    page.screenshot(path=f'./example-v3u.png')
    time.sleep(5)
    browser.close()

这里导入sync_playwright模块,顾名思义,同步执行,通过上下文管理器开启浏览器进程,随后通过channel指定edge浏览器,截图后关闭浏览器进程。

我们也可以指定headless参数为True,让浏览器在后台运行:

from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch(channel="msedge", headless=True)
    page = browser.new_page()
    page.goto('http://v3u.cn')
    page.screenshot(path=f'./example-v3u.png')
    browser.close()

除了同步模式,PlayWright也支持异步非阻塞模式:

import asyncio
from playwright.async_api import async_playwright

async def main():
    async with async_playwright() as p:
        browser = await p.chromium.launch(channel="msedge", headless=False)
        page = await browser.new_page()
        await page.goto("http://v3u.cn")
        print(await page.title())
        await browser.close()

asyncio.run(main())

可以通过原生协程库asyncio进行调用,PlayWright内置函数只需要添加await关键字即可,非常方便,与之相比,Selenium原生库并不支持异步模式,必须安装三方扩展才可以。

最炫酷的是,PlayWright可以对用户的浏览器操作进行录制,并且可以转换为相应的代码,在终端执行以下命令:

python -m playwright codegen --target python -o 'edge.py' -b chromium --channel=msedge

这里通过codegen命令进行录制,指定浏览器为edge,将所有操作写入edge.py的文件中。

与此同时,PlayWright也支持移动端的浏览器模拟,比如苹果手机:

from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    iphone_13 = p.devices['iPhone 13 Pro']
    browser = p.webkit.launch(headless=False)
    page = browser.new_page(**iphone_13)
    page.goto('https://v3u.cn')
    page.screenshot(path='./v3u-iphone.png')
    browser.close()

这里将p.devices中iPhone 13 Pro的设备参数展开传入new_page,模拟iPhone 13 Pro的浏览器访问情况。

当然了,除了UI功能测试,我们当然还需要PlayWright帮我们干点脏活累活,那就是爬虫:from playwright.sync_api import sync_playwright def extract_data(entry): name = entry.locator("h3").inner_text().strip("\n").strip() capital = entry.locator("span.country-capital").inner_text() population = entry.locator("span.country-population").inner_text() area = entry.locator("span.country-area").inner_text() return
{"name": name, "capital": capital, "population": population, "area (km sq)": area} with sync_playwright() as p: # launch the browser instance and define a new context browser = p.chromium.launch() context = browser.new_context() # open a new tab and go to the website page = context.new_page() page.goto("https://www.scrapethissite.com/pages/simple/") page.wait_for_load_state("load") # get the countries countries = page.locator("div.country") n_countries = countries.count() # loop through the elements and scrape the data data = [] for i in range(n_countries): entry = countries.nth(i) sample = extract_data(entry) data.append(sample) browser.close()这里data变量就是抓取的数据内容:[ {'name': 'Andorra', 'capital': 'Andorra la Vella', 'population': '84000', 'area (km sq)': '468.0'}, {'name': 'United Arab Emirates', 'capital': 'Abu Dhabi', 'population': '4975593', 'area (km sq)': '82880.0'}, {'name': 'Afghanistan', 'capital': 'Kabul', 'population': '29121286', 'area (km sq)': '647500.0'}, {'name': 'Antigua and Barbuda', 'capital': "St. 
John's", 'population': '86754', 'area (km sq)': '443.0'}, {'name': 'Anguilla', 'capital': 'The Valley', 'population': '13254', 'area (km sq)': '102.0'}, ]基本上,该有的功能基本都有,更多功能请参见官方文档:https://playwright.dev/python/docs/librarySeleniumSelenium曾经是用于网络抓取和网络自动化的最流行的开源无头浏览器工具之一。在使用 Selenium 进行抓取时,我们可以自动化浏览器、与 UI 元素交互并在 Web 应用程序上模仿用户操作。Selenium 的一些核心组件包括 WebDriver、Selenium IDE 和 Selenium Grid。 关于Selenium的一些基本操作请移玉步至:python3.7爬虫:使用Selenium带Cookie登录并且模拟进行表单上传文件,这里不作过多赘述。 如同前文提到的,与Playwright相比,Selenium需要第三方库来实现异步并发执行,同时,如果需要录制动作视频,也需要使用外部的解决方案。就像Playwright那样,让我们使用 Selenium 构建一个简单的爬虫脚本。首先导入必要的模块并配置 Selenium 实例,并且通过设置确保无头模式处于活动状态option.headless = True:from selenium import webdriver from selenium.webdriver.chrome.service import Service from selenium.webdriver.common.by import By # web driver manager: https://github.com/SergeyPirogov/webdriver_manager # will help us automatically download the web driver binaries # then we can use `Service` to manage the web driver's state. from webdriver_manager.chrome import ChromeDriverManager def extract_data(row): name = row.find_element(By.TAG_NAME, "h3").text.strip("\n").strip() capital = row.find_element(By.CSS_SELECTOR, "span.country-capital").text population = row.find_element(By.CSS_SELECTOR, "span.country-population").text area = row.find_element(By.CSS_SELECTOR, "span.country-area").text return {"name": name, "capital": capital, "population": population, "area (km sq)": area} options = webdriver.ChromeOptions() options.headless = True # this returns the path web driver downloaded chrome_path = ChromeDriverManager().install() # define the chrome service and pass it to the driver instance chrome_service = Service(chrome_path) driver = webdriver.Chrome(service=chrome_service, options=options) url = "https://www.scrapethissite.com/pages/simple" driver.get(url) # get the data divs countries = driver.find_elements(By.CSS_SELECTOR, "div.country") # extract the data data = list(map(extract_data, countries)) driver.quit()数据返回:[ {'name': 
'Andorra', 'capital': 'Andorra la Vella', 'population': '84000', 'area (km sq)': '468.0'}, {'name': 'United Arab Emirates', 'capital': 'Abu Dhabi', 'population': '4975593', 'area (km sq)': '82880.0'}, {'name': 'Afghanistan', 'capital': 'Kabul', 'population': '29121286', 'area (km sq)': '647500.0'}, {'name': 'Antigua and Barbuda', 'capital': "St. John's", 'population': '86754', 'area (km sq)': '443.0'}, {'name': 'Anguilla', 'capital': 'The Valley', 'population': '13254', 'area (km sq)': '102.0'}, ]性能测试在数据抓取量一样的前提下,我们当然需要知道到底谁的性能更好,是PlayWright,还是Selenium? 这里我们使用Python3.10内置的time模块来统计爬虫脚本的执行速度。 PlayWright:import time from playwright.sync_api import sync_playwright def extract_data(entry): name = entry.locator("h3").inner_text().strip("\n").strip() capital = entry.locator("span.country-capital").inner_text() population = entry.locator("span.country-population").inner_text() area = entry.locator("span.country-area").inner_text() return {"name": name, "capital": capital, "population": population, "area (km sq)": area} start = time.time() with sync_playwright() as p: # launch the browser instance and define a new context browser = p.chromium.launch() context = browser.new_context() # open a new tab and go to the website page = context.new_page() page.goto("https://www.scrapethissite.com/pages/") # click to the first page and wait while page loads page.locator("a[href='/pages/simple/']").click() page.wait_for_load_state("load") # get the countries countries = page.locator("div.country") n_countries = countries.count() data = [] for i in range(n_countries): entry = countries.nth(i) sample = extract_data(entry) data.append(sample) browser.close() end = time.time() print(f"The whole script took: {end-start:.4f}")Selenium:import time from selenium import webdriver from selenium.webdriver.chrome.service import Service from selenium.webdriver.common.by import By # web driver manager: https://github.com/SergeyPirogov/webdriver_manager # will help us automatically download the web 
driver binaries # then we can use `Service` to manage the web driver's state. from webdriver_manager.chrome import ChromeDriverManager def extract_data(row): name = row.find_element(By.TAG_NAME, "h3").text.strip("\n").strip() capital = row.find_element(By.CSS_SELECTOR, "span.country-capital").text population = row.find_element(By.CSS_SELECTOR, "span.country-population").text area = row.find_element(By.CSS_SELECTOR, "span.country-area").text return {"name": name, "capital": capital, "population": population, "area (km sq)": area} # start the timer start = time.time() options = webdriver.ChromeOptions() options.headless = True # this returns the path web driver downloaded chrome_path = ChromeDriverManager().install() # define the chrome service and pass it to the driver instance chrome_service = Service(chrome_path) driver = webdriver.Chrome(service=chrome_service, options=options) url = "https://www.scrapethissite.com/pages/" driver.get(url) # get the first page and click to the link first_page = driver.find_element(By.CSS_SELECTOR, "h3.page-title a") first_page.click() # get the data div and extract the data countries_container = driver.find_element(By.CSS_SELECTOR, "section#countries div.container") countries = driver.find_elements(By.CSS_SELECTOR, "div.country") # scrape the data using extract_data function data = list(map(extract_data, countries)) end = time.time() print(f"The whole script took: {end-start:.4f}") driver.quit()

测试结果:Y轴是执行时间,一望而知,Selenium比PlayWright慢了大概五倍左右。

红玫瑰还是白玫瑰?

不得不承认,Playwright 和 Selenium 都是出色的自动化无头浏览器工具,都可以完成爬虫任务。我们还不能断定哪个更好一点,所以选择哪个取决于你的网络抓取需求、你想要抓取的数据类型、浏览器支持和其他考虑因素:

Playwright 不支持真实设备,而 Selenium 可用于真实设备和远程服务器。
Playwright 具有内置的异步并发支持,而 Selenium 需要第三方工具。
Playwright 的性能比 Selenium 高。
Selenium 不支持详细报告和视频录制等功能,而 Playwright 具有内置支持。
Selenium 比 Playwright 支持更多的浏览器。
Selenium 支持更多的编程语言。

结语

如果您看完了本篇文章,那么到底谁是最好的无头浏览器工具,答案早已在心间,所谓强中更有强中手,只有弱者才害怕竞争,相信PlayWright的出现会让Selenium变为更好的自己,再接再厉,再创辉煌。
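补充一点:正文提到PlayWright原生支持asyncio而Selenium需要三方扩展,这个"多任务并发"的骨架可以用纯标准库模拟一下(不依赖playwright,fetch_page协程只是笔者虚构的占位任务,真实场景里对应await page.goto等调用):

```python
import asyncio

async def fetch_page(url: str) -> str:
    # 占位:模拟一次异步页面加载,真实场景中这里是 await page.goto(url) 等调用
    await asyncio.sleep(0)
    return f"title-of-{url}"

async def main() -> list:
    urls = ["http://v3u.cn", "https://playwright.dev"]
    # gather 并发调度多个协程,这正是异步模式相对同步串行的优势所在
    return await asyncio.gather(*(fetch_page(u) for u in urls))

titles = asyncio.run(main())
print(titles)
```

把占位协程换成PlayWright的async API调用,就能让多个页面任务在一个事件循环里并发执行,而不是逐个串行等待。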
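另外,正文的性能对比用的是time.time()前后打点,这里给出一个可复用的计时小工具(纯标准库;timed与fake_scrape均为笔者虚构的示例名,perf_counter比time.time更适合测量耗时):

```python
import time

def timed(fn, *args, **kwargs):
    """运行 fn 并返回 (结果, 耗时秒数),等价于正文中 start/end 打点的写法。"""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - start

def fake_scrape(n):
    # 占位任务:模拟抓取并解析 n 条国家数据后的返回结果
    return [{"name": f"country-{i}"} for i in range(n)]

data, elapsed = timed(fake_scrape, 250)
print(len(data), f"{elapsed:.4f}")
```

把fake_scrape换成PlayWright或Selenium的抓取函数,就可以用同一把"秒表"公平地对比两者的脚本耗时。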

上古神兵,先天至宝,Win11平台安装和配置NeoVim0.8.2编辑器搭建Python3开发环境(2023最新攻略)

毫无疑问,我们生活在编辑器的最好年代,Vim是仅在Vi之下的神级编辑器,而脱胎于Vim的NeoVim则是这个时代最好的编辑器,没有之一。异步支持、更好的内存管理、更快的渲染速度、更多的编辑命令,是大神Thiago de Arruda对开发者们最好的技术馈赠。之前一篇:Win10系统下安装编辑器之神(The God of Editor)Vim并且构建Python生态开发环境(2020年最新攻略),我们已经领略了Vim的魅力,但时代不同了,繁琐的配置,不尽如人意的性能,很难不让人把目光投向NeoVim,正所谓江山代有人才出,一代更比一代强。

安装配置

首先去Github项目官网下载最新稳定版0.8.2:https://github.com/neovim/neovim/releases/tag/stable ,选择Windows 64位的压缩包文件:nvim-win64.zip,下载成功后,无须安装,解压安装包,放入合适的目录中,比如 C:\nvim-win64中。NeoVim有两个启动程序,分别是nvim-qt.exe和nvim.exe,前者是基于Gui的客户端,后者则基于终端Terminal,解压之后,最好将bin目录配置到系统的环境变量:C:\nvim-win64\nvim-win64\bin ,如此,我们就可以在系统的任意位置启动NeoVim。

随后我们安装基于异步方法的插件管理工具:vim-plug。首先,在vim-plug首页:https://github.com/junegunn/vim-plug 下载plug.vim配置文件,随后将其复制到C:\Users\liuyue\AppData\Local\nvim\autoload下,如果没有这个文件夹,就自己建一个nvim\autoload文件夹。这里需要注意的是AppData目录默认是隐藏的,需要在windows目录选项中开启显示隐藏目录。

其后,在C:\Users\liuyue\AppData\Local\nvim\目录中建立NeoVim的初始化配置init.vim:

call plug#begin('C:\nvim-win64\nvim-win64\share\nvim\plugged')
"插件列表
call plug#end()

这里首行是插件的安装目录(注意路径两侧的单引号要配对),随后只要把想要安装的插件写入到两个call关键字之间即可。至此,NeoVim的安装就完成了。

第一个NeoVim插件

第一个NeoVim插件我们从主题入手,毕竟个性化是最不能被忽略的需求,这里主题推荐邪魅狂狷的One Dark主题:https://github.com/navarasu/onedark.nvim 修改init.vim配置:

call plug#begin('C:\nvim-win64\nvim-win64\share\nvim\plugged')
Plug 'navarasu/onedark.nvim'
call plug#end()

let g:onedark_config = {
    \ 'style': 'warm',
\}
colorscheme onedark

这里添加Plug 'navarasu/onedark.nvim'插件,随后通过g:onedark_config变量和colorscheme onedark命令对NeoVim的主题进行设置(注意花括号要用\}闭合),保存之后,在终端启动NeoVim:

nvim test.py

发现主题并未发生变化,那是因为插件必须先进行安装,在命令模式输入:

:PlugInstall

随后重启nvim,One Dark主题跃然纸上。

目录管理

目录管理插件可以让开发者迅速地操作项目目录中的代码,这里推荐使用https://github.com/pablopunk/native-sidebar.vim ,简单方便,开箱可用:

call plug#begin('C:\nvim-win64\nvim-win64\share\nvim\plugged')
Plug 'navarasu/onedark.nvim'
Plug 'pablopunk/native-sidebar.vim'
call plug#end()

let g:onedark_config = {
    \ 'style': 'warm',
\}
colorscheme onedark

let g:native_sidebar_shortcut = '<c-t>'

这里我们通过control+t来开启左侧目录树。

终端配置

Windows11系统默认采用的还是Win10时代丑陋的CMD终端风格,但其实,Windows11也默认预装了最新的Windows Terminal终端。首先按视窗键+R,输入wt
第一次启动Windows Terminal:在终端窗口中点击下拉菜单,找到设置选项。默认终端应用程序可以修改为 Windows Terminal,这样启动CMD时就是Windows Terminal 终端窗口了。如此,NeoVim的字体风格就可以继承Windows Terminal的新风格了。

Python代码补全配置

用NeoVim来写Python代码,就会有代码补全的需求,业内比较流行的插件是jedi-vim:https://github.com/davidhalter/jedi-vim 。jedi-vim针对开发者的需求,编写如语法增强、文档查看、自动补全等各类功能,并且进行了重构和集成,提供了开箱即用的统一解决方案,一经推出便广受好评,成为使用 Vim 进行 Python 开发的标配。但jedi-vim虽然开箱即用,却是一锅杂乱的乱炖,不仅随着项目功能的增加变得越发庞大和迟缓(有点类似著名的node_modules),代码的可读性也非常糟糕,难以维护和参与。

所以这里推荐性能更优越的ncm2,一个异步自动补全框架:https://github.com/ncm2/ncm2

首先安装相关依赖:

python3 -m pip install pynvim
python3 -m pip install jedi
pip3 install neovim --upgrade

随后编写配置:

call plug#begin('C:\nvim-win64\nvim-win64\share\nvim\plugged')
Plug 'navarasu/onedark.nvim'
Plug 'pablopunk/native-sidebar.vim'
Plug 'ncm2/ncm2'
Plug 'roxma/nvim-yarp'
Plug 'ncm2/ncm2-bufword'
Plug 'ncm2/ncm2-path'
Plug 'ncm2/ncm2-jedi'
call plug#end()

let g:onedark_config = {
    \ 'style': 'warm',
\}
colorscheme onedark

autocmd BufEnter * call ncm2#enable_for_buffer()
" IMPORTANT: :help Ncm2PopupOpen for more information
set completeopt=noinsert,menuone,noselect
let g:native_sidebar_shortcut = '<c-t>'

主要依赖这几个插件:Plug 'ncm2/ncm2'、Plug 'roxma/nvim-yarp'、Plug 'ncm2/ncm2-bufword'、Plug 'ncm2/ncm2-path'、Plug 'ncm2/ncm2-jedi'。

随后开启NeoVim进行安装:

:PlugInstall

重启NeoVim,看起来还不错吧?
最后,继续修改配置,让NeoVim可以直接编译运行Python代码:

nnoremap <C-B> :sp <CR> :term python % <CR>
nnoremap <C-W> :bd!<CR>

这里通过control+b快捷键组合来编译运行,control+w组合键关闭弹窗。轻量化、简单、快速,让普通小白也能玩得起来,这就是在Win11下用NeoVim编写Python的乐趣,奉上笔者的NeoVim完整配置:

call plug#begin('C:\nvim-win64\nvim-win64\share\nvim\plugged')
Plug 'navarasu/onedark.nvim'
Plug 'pablopunk/native-sidebar.vim'
Plug 'ncm2/ncm2'
Plug 'roxma/nvim-yarp'
Plug 'ncm2/ncm2-bufword'
Plug 'ncm2/ncm2-path'
Plug 'ncm2/ncm2-jedi'
call plug#end()

let g:onedark_config = {
    \ 'style': 'warm',
\}
colorscheme onedark

autocmd BufEnter * call ncm2#enable_for_buffer()
" IMPORTANT: :help Ncm2PopupOpen for more information
set completeopt=noinsert,menuone,noselect
let g:native_sidebar_shortcut = '<c-t>'
set clipboard^=unnamed,unnamedplus
syntax on "syntax highlighting, see :help syntax
filetype plugin indent on "file type detection, see :help filetype
set number "display line number
set path+=** "improves searching, see :help path
set noswapfile "disable use of swap files
set wildmenu "completion menu
set backspace=indent,eol,start "ensure proper backspace functionality
set undodir=~/.cache/nvim/undo "undo ability will persist after exiting file
set undofile "see :help undodir and :help undofile
set incsearch "see results while search is being typed, see :help incsearch
set smartindent "auto indent on new lines, see :help smartindent
set ic "ignore case when searching
set expandtab "expanding tab to spaces
set tabstop=4 "setting tab to 4 columns
set shiftwidth=4 "setting tab to 4 columns
set softtabstop=4 "setting tab to 4 columns
set showmatch "display matching bracket or parenthesis
set hlsearch incsearch "highlight all previous search pattern with incsearch
highlight ColorColumn ctermbg=9 "display ugly bright red bar at color column number
" Keybind Ctrl+l to clear search
nnoremap <C-l> :nohl<CR><C-l>:echo "Search Cleared"<CR>
" When python filetype is detected, F5 can be used to execute script
" autocmd FileType python nnoremap <buffer> <c-b> :<cr>:exec
'!python' shellescape(expand('%:p'), 1)<cr> nnoremap <C-B> :sp <CR> :term python % <CR> nnoremap <C-W> :bd!<CR>结语NeoVim是Vim的精神复刻与肉体重生,承袭了Vim的所有操作技巧,假如我们说,二十一世纪以来编辑器领域有什么经典软件,无疑的,我们应该说,Vim和NeoVim是两个颠扑不破的巨石重镇,没有了它们,编辑器史上便要黯然失光。最后,奉上项目配置地址,与君共觞:https://github.com/zcxey2911/Win11-neovim0.8.2-config-Python

影片自由,丝滑流畅,Docker容器基于WebDav协议通过Alist挂载(百度网盘/阿里云盘)Python3.10接入

使用过NAS(Network Attached Storage)的朋友都知道,它可以通过局域网将本地硬盘转换为局域网内的“网盘”,简单理解就是搭建自己的“私有云”,但是硬件和网络成本都太高了,有点可望而不可及的意思。Alist开源库则可以满足我们,它能将公共网盘反过来变成一种联网的本地硬盘,使用Web页面来统一挂载和管理,网盘类型包含但不限于:百度网盘、阿里云盘、迅雷网盘等等。Alist挂载网盘的另外一个好处是可以基于WebDav协议直接播放网盘资源,虽然说网盘也支持在线播放功能,但是代价就是得充会员,没错,这符合逻辑,网盘主机厂也得盈利,但Alist技术可以帮助我们曲线救国,节省一笔开支。 此外,使用WebDAV的精髓在于WebDAV可以被挂载为一个本地(服务器)磁盘,正因为WebDAV可以被映射为一个本地目录,所以只需要调用本地播放器或者本地搭载的浏览器播放器进行播放。无论是mkv、wmv或是h.265编码方案,通过一个现代的本地播放器都能完美的播放,不存在需要转码的情况,所以,使用WebDAV协议,服务器的负担只有传输数据这一个任务。Docker部署AlistAlist软件可以通过多种方式进行安装和部署,但最方便的,还是通过Docker,主要是因为由于各大网盘主机厂的网盘版本更新频率很快,所以Alist的版本也会随之频繁更新,而Docker的操作最简单快捷,只需要简单的命令就可以完成部署,更适合这种频繁更新的情况。 关于Docker请移玉步至一寸宕机一寸血,十万容器十万兵|Win10/Mac系统下基于Kubernetes(k8s)搭建Gunicorn+Flask高可用Web集群,这里不作过多赘述。 首先在终端执行命令:docker run -d --restart=always -v /etc/alist:/opt/alist/data -p 5244:5244 -e PUID=0 -e PGID=0 -e UMASK=022 --name="alist" xhofe/alist:latest该命令会在后台生成一个Alist容器,服务运行在系统的5244端口,如果是首次运行,会拉取最新的Alist镜像:➜ interview git:(main) docker run -d --restart=always -v /etc/alist:/opt/alist/data -p 5244:5244 -e PUID=0 -e PGID=0 -e UMASK=022 --name="alist" xhofe/alist:latest Unable to find image 'xhofe/alist:latest' locally latest: Pulling from xhofe/alist b1101342f8ad: Pull complete d9f5c37d20f9: Pull complete 5f4a1655e3cc: Pull complete c1e599f8ce92: Pull complete d613bea8ea45: Pull complete Digest: sha256:520e531ddaf5732c4944d5c35ad4dbb601e2fadae14b99a81e86ea3f7e065173 Status: Downloaded newer image for xhofe/alist:latest 7bf1c7f384526bd22aa078223d548ab0c16b79c245919e8a0cf7b439e79f34d6随后执行命令:docker ps就可以看到正在运行的Alist服务容器:➜ ~ docker ps CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES 7bf1c7f38452 xhofe/alist:latest "/entrypoint.sh" 3 hours ago Up 3 hours 0.0.0.0:5244->5244/tcp alist ➜ ~Alist服务平台基于前后端分离的Gin和React,所以平台管理页面需要用户名和密码才能登入,输入命令:docker exec -it alist ./alist admin该命令会进入容器并展示账号和密码:INFO[2023-02-13 22:54:17] admin user's info: username: admin password: 8U5js3bH记录下来,注意这是本地的服务,所以外网是无法进行登录的。 至此,Alist的本地部署就完成了,假如Alist发了新的版本,也可以通过下面的命令进行更新操作:docker stop alist 
#停止alist容器
docker rm -f alist
#删除alist容器,因为之前映射到了本地,所以数据不会被删除
cp -r /root/data/docker_data/alist /root/data/docker_data/alist.bak
#可选,如果不放心,可以备份一下数据
docker pull xhofe/alist:latest
#拉取最新的alist镜像
docker run -d --restart=always -v /root/data/docker_data/alist:/opt/alist/data -p 5244:5244 --name="alist" xhofe/alist:latest
#运行安装命令,注意-v挂载的路径与原来相同

这里的区别就是通过挂载命令将alist的配置文件挂载到宿主机的/root/data/docker_data/alist目录,方便升级后进行使用。

挂载百度网盘

部署好Alist服务后,访问本地网址进行登录:http://localhost:5244/@manage ,用户名和密码就是上文中Docker返回的,登录成功后,选择左侧菜单中的存储,添加百度网盘。百度云盘的操作完全基于百度云的开放API,只要给Alist授权操作接口的权限即可,进入网址:https://tool.nn.ci/baidu/callback?code=288faa8f669a3d174ea9af0bd1d72ab5 进行授权操作,记录client_id、client_secret和refresh_token,分别将三个参数填入挂载的表单中,然后挂载目录填入根目录:/,注意表单中最好把web代理选项勾选。随后进入Alist服务首页:http://localhost:5244 ,就可以在线播放百度云内存储的资源,非常方便。

挂载阿里云盘

截止到本文发布的2-14号,阿里云盘目前挂载过程中会出现设备id的bug,但是挂载阿里云盘分享的网盘还是没问题的,由于阿里云盘操作基于客户端的token,所以必须先通过移动端登录页面来获取token:

https://passport.aliyundrive.com/mini_login.htm?lang=zh_cn&appName=aliyun_drive&appEntrance=web&styleType=auto&bizParams=&notLoadSsoView=false&notKeepLogin=false&isMobile=true&hidePhoneCode=true&rnd=0.9186864872885723

登录成功后,通过抓包,获取后端login.do接口的返回值,将bizExt的值复制出来,然后利用Python的Base64模块进行解码操作:

import base64

coded_string = '''Q5YACgA...'''
base64.b64decode(coded_string)

解码出来的refreshToken就是我们需要的令牌:

"refreshToken":"sdfdsfsdfdsfb9fadd4f62ee4be968e"

随后在后台将token和分享的id填入表单即可。注意这里挂载路径不能填入根目录/,因为之前我们已经挂载了百度网盘了,所以选择一个子目录share。至此,阿里云盘分享就挂载好了,可以坐下来,犒劳自己了。

Python3.10接入

除了在线播放,我们还可以使用Python3.10直接通过WebDav协议操作Alist挂载的网盘,可谓是神乎其技了。首先安装WebDav库:

pip3 install webdavclient3

随后编写webdav.py文件:

from webdav3.client import Client

options = {
    'webdav_hostname': "http://localhost:5244/dav",
    'webdav_login': "admin",
    'webdav_password': "8U5js3bH"
}

client = Client(options)
client.verify = False # To not check SSL certificates (Default = True)
files1 = client.list()
print(files1)

这里的webdav_hostname指的是刚才用docker挂载的webdav服务路径,账号和密码是上文中docker返回的,不用担心外泄,因为是本地服务。

程序返回:

➜ gotest /opt/homebrew/bin/python3.10
"/Users/liuyue/wodfan/work/gotest/webdav.py"
['dav/', 'aliyunpan/', 'The.Last.of.Us.S01E03.1080p.WEB-DL.DDP5.1.Atmos.H.264-Q66.mkv', 'The.Last.of.Us.S01E05.1080p.WEB-DL.DDP5.1.Atmos.H.264-Q66.mkv', 'The.Last.of.Us.S01E04.1080p.WEB-DL.DDP5.1.Atmos.H.264-Q66.mkv', 'house.of.the.dragon.s01e08.1080p.web.h264-cakes.chs.eng.mp4', 'House.of.the.Dragon.S01E07.Driftmark.1080p.HMAX.WEB-DL.DDP5.1.Atmos.H.264-SMURF.chs.eng.mp4', 'House.of.the.Dragon.S01E06.The.Princess.and.the.Queen.720p.HMAX.WEB-DL.DDP5.1.H.264-NTb.chs.eng.mp4', 'House.of.the.Dragon.S01E05.We.Light.the.Way.1080p.HMAX.WEB-DL.DDP5.1.Atmos.H.264-SMURF.chs.eng.mp4', 'house.of.the.dragon.s01e04.720p.web.h264-cakes.chs.eng.mp4', 'house.of.the.dragon.s01e03.720p.web.h264-cakes.chs.eng.mp4', 'share/']

可以很方便地将挂载后的网盘文件目录进行返回。除此之外,我们也可以针对网盘资源进行增删改查的动态操作:

# Create directory
client.mkdir("dir1/dir2")
# Delete resource
client.clean("dir1/dir2")
# Copy resource
client.copy(remote_path_from="dir1/file1", remote_path_to="dir2/file1")
client.copy(remote_path_from="dir2", remote_path_to="dir3")
# Move resource
client.move(remote_path_from="dir1/file1", remote_path_to="dir2/file1")
client.move(remote_path_from="dir2", remote_path_to="dir3")
# Download a resource
client.download_sync(remote_path="dir1/file1", local_path="~/Downloads/file1")
client.download_sync(remote_path="dir1/dir2/", local_path="~/Downloads/dir2/")
# Upload resource
client.upload_sync(remote_path="dir1/file1", local_path="~/Documents/file1")
client.upload_sync(remote_path="dir1/dir2/", local_path="~/Documents/dir2/")

也就是说,只要Alist服务已经挂载好网盘,我们甚至不需要平台界面,只编写代码就可以对网盘资源予取予求。

结语

旧时王谢堂前燕,飞入寻常百姓家。只要一台联网的电脑,就可以实现自己的"私有云",成本低到令人发指,Alist,新时代的普罗米修斯,为我们带来了网盘自由的火种。
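正文中从bizExt解码refreshToken的流程,可以用一个纯标准库的小例子完整演示(其中的JSON结构和token值均为笔者虚构的示例数据,真实的bizExt内容以抓包结果为准):

```python
import base64
import json

# 虚构一段与抓包结果结构类似的 bizExt 明文,并编码成 Base64 以模拟接口返回值
plain = json.dumps({"pds_login_result": {"refreshToken": "example-token-123"}})
coded_string = base64.b64encode(plain.encode("utf-8")).decode("ascii")

# 与正文步骤一致:先 Base64 解码,再从 JSON 中取出 refreshToken
decoded = json.loads(base64.b64decode(coded_string))
token = decoded["pds_login_result"]["refreshToken"]
print(token)
```

实际操作时,把coded_string换成抓包得到的bizExt值,再按解码后的真实JSON层级取出refreshToken即可。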

闻其声而知雅意,M1 Mac基于PyTorch(mps/cpu/cuda)的人工智能AI本地语音识别库Whisper(Python3.10)

前文回溯,之前一篇:含辞未吐,声若幽兰,史上最强免费人工智能AI语音合成TTS服务微软Azure(Python3.10接入),利用AI技术将文本合成语音,现在反过来,利用开源库Whisper再将语音转回文字,所谓闻其声而知雅意。

Whisper是OpenAI开源的语音识别模型,支持多种语言的语音识别。它基于Transformer的编码器-解码器(encoder-decoder)架构,先将音频转换为梅尔频谱,再解码为文本,同时内置了语种检测能力,可以完成转写(transcribe)和翻译(translate)两类任务。它使用PyTorch进行开发,可以通过Python API来调用语音识别,并且提供了从tiny到large的一系列预训练模型,帮助用户快速上手。

PyTorch基于MPS的安装

我们知道PyTorch一直以来在M芯片的MacOS系统中都不支持cuda模式,而现在,新的MPS后端扩展了PyTorch生态系统,可以直接在Apple芯片的GPU上设置和运行张量运算。截止本文发布,PyTorch与Python 3.11不兼容,所以我们将使用最新的 3.10.x 版本。

确保安装Python3.10最新版:

➜ transformers git:(stable) python3
Python 3.10.9 (main, Dec 15 2022, 17:11:09) [Clang 14.0.0 (clang-1400.0.29.202)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>>

随后运行安装命令:

pip3 install --pre torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/nightly/cpu

安装成功后,在终端里验证PyTorch-MPS的状态:

➜ transformers git:(stable) python3
Python 3.10.9 (main, Dec 15 2022, 17:11:09) [Clang 14.0.0 (clang-1400.0.29.202)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> torch.backends.mps.is_available()
True
>>>

返回True即可。

PyTorch MPS性能测试

这里的MPS指的是Apple的Metal Performance Shaders,并不是NVIDIA的Multi-Process Service。PyTorch的MPS后端基于Metal框架,把张量运算映射到Apple芯片(M1/M2等)的GPU核心上执行,从而在不改变模型结构的情况下加速训练和推理。可以使用MPS来加速卷积神经网络(CNNs)、循环神经网络(RNNs)和其他类型的神经网络。需要注意的是,MPS是单机的GPU加速后端,并不是分布式训练方式。

现在我们来做一个简单测试:

import torch
import timeit
import random

x = torch.ones(50000000, device='cpu')
print(timeit.timeit(lambda: x * random.randint(0, 100), number=1))

首先创建一个大小为 50000000 的全为1的张量 x,并将其设置为在cpu上运算,最后使用 timeit.timeit 函数来测量在 CPU 上执行 x 乘以一个随机整数的时间,number=1表示只运行一次。这段代码的作用是在cpu上测量运算一个张量的时间。

运行结果:

➜ nlp_chinese /opt/homebrew/bin/python3.10 "/Users/liuyue/wodfan/work/nlp_chinese/mps_test.py"
0.020812375005334616

在10核M1pro的cpu芯片加持下,运行时间为:0.020812375005334616。

随后换成MPS模式:

import torch
import timeit
import random

x = torch.ones(50000000, device='mps')
print(timeit.timeit(lambda: x * random.randint(0, 100), number=1))

程序返回:

➜ nlp_chinese /opt/homebrew/bin/python3.10 "/Users/liuyue/wodfan/work/nlp_chinese/mps_test.py"
0.003058041911572218

16核的GPU仅用时:0.003058041911572218,也就是说MPS的运行速度比CPU提升了7倍左右。

Whisper语音识别

安装好了PyTorch,我们安装Whisper:

pip install --upgrade --no-deps --force-reinstall git+https://github.com/openai/whisper.git

安装好之后进行验证:

➜ transformers git:(stable) whisper
usage: whisper [-h] [--model {tiny.en,tiny,base.en,base,small.en,small,medium.en,medium,large}] [--model_dir MODEL_DIR] [--device DEVICE] [--output_dir OUTPUT_DIR] [--verbose VERBOSE] [--task {transcribe,translate}] [--language
{af,am,ar,as,az,ba,be,bg,bn,bo,br,bs,ca,cs,cy,da,de,el,en,es,et,eu,fa,fi,fo,fr,gl,gu,ha,haw,hi,hr,ht,hu,hy,id,is,it,iw,ja,jw,ka,kk,km,kn,ko,la,lb,ln,lo,lt,lv,mg,mi,mk,ml,mn,mr,ms,mt,my,ne,nl,nn,no,oc,pa,pl,ps,pt,ro,ru,sa,sd,si,sk,sl,sn,so,sq,sr,su,sv,sw,ta,te,tg,th,tk,tl,tr,tt,uk,ur,uz,vi,yi,yo,zh,Afrikaans,Albanian,Amharic,Arabic,Armenian,Assamese,Azerbaijani,Bashkir,Basque,Belarusian,Bengali,Bosnian,Breton,Bulgarian,Burmese,Castilian,Catalan,Chinese,Croatian,Czech,Danish,Dutch,English,Estonian,Faroese,Finnish,Flemish,French,Galician,Georgian,German,Greek,Gujarati,Haitian,Haitian Creole,Hausa,Hawaiian,Hebrew,Hindi,Hungarian,Icelandic,Indonesian,Italian,Japanese,Javanese,Kannada,Kazakh,Khmer,Korean,Lao,Latin,Latvian,Letzeburgesch,Lingala,Lithuanian,Luxembourgish,Macedonian,Malagasy,Malay,Malayalam,Maltese,Maori,Marathi,Moldavian,Moldovan,Mongolian,Myanmar,Nepali,Norwegian,Nynorsk,Occitan,Panjabi,Pashto,Persian,Polish,Portuguese,Punjabi,Pushto,Romanian,Russian,Sanskrit,Serbian,Shona,Sindhi,Sinhala,Sinhalese,Slovak,Slovenian,Somali,Spanish,Sundanese,Swahili,Swedish,Tagalog,Tajik,Tamil,Tatar,Telugu,Thai,Tibetan,Turkish,Turkmen,Ukrainian,Urdu,Uzbek,Valencian,Vietnamese,Welsh,Yiddish,Yoruba}]随后安装ffmpeg:brew install ffmpeg然后编写语音识别代码:import whisper model = whisper.load_model("small") # load audio and pad/trim it to fit 30 seconds audio = whisper.load_audio("/Users/liuyue/wodfan/work/mydemo/b1.wav") audio = whisper.pad_or_trim(audio) # make log-Mel spectrogram and move to the same device as the model mel = whisper.log_mel_spectrogram(audio).to("cpu") # detect the spoken language _, probs = model.detect_language(mel) print(f"Detected language: {max(probs, key=probs.get)}") # decode the audio options = whisper.DecodingOptions(fp16 = False) result = whisper.decode(model, mel, options) # print the recognized text print(result.text)这里导入音频后,通过whisper.log\_mel\_spectrogram方法自动检测语言,然后输出文本:➜ minGPT git:(master) ✗ /opt/homebrew/bin/python3.10 
"/Users/liuyue/wodfan/work/minGPT/wisper_test.py" Detected language: zh Hello大家好,这里是刘悦的技术博客,众神殿内,高朋满座,圣有如云,VMware,Virtual Box,UPM等虚拟机大神群英汇翠,指见位于C位王座上的Parallels唱网抬头,缓缓群寻,屁腻群小,目光到处,无人敢抬头对视。是的,如果说虚拟机领域有一位王者,非Parallels不能领袖群伦,毕竟大厂背书,功能满格,美中不足之处就是价格略高,这里使用的small模型,也可以用更大的模型比如:medium、large。模型越大,效果越好。 如果想使用MPS的方式,需要改写一下Whisper源码,将load\_model方法的参数改为mps即可:def load_model(name: str, device: Optional[Union[str, torch.device]] = None, download_root: str = None, in_memory: bool = False) -> Whisper: Load a Whisper ASR model Parameters ---------- name : str one of the official model names listed by `whisper.available_models()`, or path to a model checkpoint containing the model dimensions and the model state_dict. device : Union[str, torch.device] the PyTorch device to put the model into download_root: str path to download the model files; by default, it uses "~/.cache/whisper" in_memory: bool whether to preload the model weights into host memory Returns ------- model : Whisper The Whisper ASR model instance if device is None: device = "cuda" if torch.cuda.is_available() else "mps"代码在第18行。 随后运行脚本也改成mps:import whisper model = whisper.load_model("medium") # load audio and pad/trim it to fit 30 seconds audio = whisper.load_audio("/Users/liuyue/wodfan/work/mydemo/b1.wav") audio = whisper.pad_or_trim(audio) # make log-Mel spectrogram and move to the same device as the model mel = whisper.log_mel_spectrogram(audio).to("mps") # detect the spoken language _, probs = model.detect_language(mel) print(f"Detected language: {max(probs, key=probs.get)}") # decode the audio options = whisper.DecodingOptions(fp16 = False) result = whisper.decode(model, mel, options) # print the recognized text print(result.text)这回切换为medium模型,程序返回:➜ minGPT git:(master) ✗ /opt/homebrew/bin/python3.10 "/Users/liuyue/wodfan/work/minGPT/wisper_test.py" 100%|█████████████████████████████████████| 1.42G/1.42G [02:34<00:00, 9.90MiB/s] Detected language: zh Hello 
大家好,这里是刘悦的技术博客,众神殿内,高朋满座,圣有如云,VMware,Virtualbox,UTM等虚拟机大神群音惠翠,只见位于C位王座上的Parallels唱往抬头,缓缓轻寻,屁逆群小,目光到处,无人敢抬头对视。

效率和精准度提升了不少,但medium模型的体积也更大,达到了1.42G。

结语

Whisper作为OpenAI开源的语音识别库,支持多种语言,基于Transformer编码器-解码器架构将语音转换为文本,支持加载自定义的模型权重,可以用于实现在线语音识别,并且内置语言检测等高级功能,在PyTorch的MPS加成下,更是如虎添翼,绝世好库,值得拥有。
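顺带一提,正文中用到的whisper.pad_or_trim会把音频补齐或裁剪到30秒定长再做频谱,其核心逻辑可以用纯Python示意如下(采样率16000、30秒定长是Whisper的默认约定,函数实现为自拟的简化版):

```python
SAMPLE_RATE = 16000                      # Whisper 默认采样率
CHUNK_LENGTH = 30                        # 单段音频长度(秒)
N_SAMPLES = SAMPLE_RATE * CHUNK_LENGTH   # 480000 个采样点

def pad_or_trim(samples, length=N_SAMPLES):
    """短于 length 则在尾部补零,长于 length 则截断,返回定长序列。"""
    if len(samples) >= length:
        return samples[:length]
    return samples + [0.0] * (length - len(samples))

print(len(pad_or_trim([0.1] * 500000)))  # 超长音频被截断为 480000
print(len(pad_or_trim([0.1] * 1000)))    # 过短音频被补零到 480000
```

理解了这一点,就能明白为什么超过30秒的长音频需要自行分段后再交给模型转写。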

防微杜渐,未雨绸缪,百度网盘(百度云盘)接口API自动化备份上传以及开源发布,基于Golang1.18

奉行长期主义的开发者都有一个共识:对于服务器来说,数据备份非常重要,因为服务器上的数据通常是无价的,如果丢失了这些数据,可能会导致严重的后果,伴随云时代的发展,备份技术也让千行百业看到了其“云基因”的成长与进化,即基于云存储的云备份。 本次我们使用Golang1.18完成百度网盘(百度云盘)接口API自动化备份上传功能,以及演示如何将该模块进行开源发布。百度网盘API接入授权如果希望golang服务可以访问并且上传用户的百度网盘,则需要经过用户同意,这个流程被称为“授权”。百度网盘开放平台基于 OAuth2.0 接入授权。OAuth2.0 是一种授权协议,通过该协议用户可以授权开发者应用访问个人网盘信息与文件。 用户同意授权后,开发者应用会获取到一个 Access Token,该 Access Token 是用户同意授权的凭证。开发者应用需要依赖 Access Token 凭证调用百度网盘公开API,实现访问用户网盘信息与授权资源。基本流程和三方登录差不多,需要跳转百度网盘授权页进行授权动作,随后授权码(code)会发送到回调网址,再用授权码换取Access Token。但不一样的是,百度官网提供一种相对简单的获取code方式,即oob,所谓oob就是直接在线请求后在表单中复制授权码即可,不需要回调网址的参与。 首先根据官网文档:https://pan.baidu.com/union/doc/ol0rsap9s 创建应用,创建好之后,将应用id拼接位oob授权网址:https://openapi.baidu.com/oauth/2.0/authorize?client_id=你的应用id&response_type=code&redirect_uri=oob&scope=basic+netdisk在线访问复制授权码: 注意授权码一次性有效并且会在10分钟后过期,随后编写代码获取token:package bdyp import ( "fmt" "net/http" "net/url" type Bcloud struct { app_key string app_secret string accessToken string refreshToken string logger Logger type tokenResp struct { *Token ErrorDescription string `json:"error_description"` type Token struct { AccessToken string `json:"access_token"` RefreshToken string `json:"refresh_token"` ExpiresIn int `json:"expires_in"` func (r *Bcloud) GetToken(code, redirectURI, app_key, app_secret string) (*Token, error) { uri := fmt.Sprintf("https://openapi.baidu.com/oauth/2.0/token?"+ "grant_type=authorization_code&"+ "code=%s&"+ "client_id=%s&"+ "client_secret=%s&"+ "redirect_uri=%s", url.QueryEscape(code), url.QueryEscape(app_key), url.QueryEscape(app_secret), redirectURI) resp := new(tokenResp) err := r.requestJSON(http.MethodGet, uri, nil, resp) if err != nil { return nil, err } else if resp.ErrorDescription != "" { return nil, fmt.Errorf(resp.ErrorDescription) r.app_key = app_key r.app_secret = app_secret r.accessToken = resp.AccessToken r.refreshToken = resp.RefreshToken return resp.Token, nil }这里分别创建网盘结构体和秘钥结构体,通过官方接口将oob方式获取的code交换token,分别为accessToken和refreshToken,refreshToken用于刷新 Access Token, 
有效期为10年。这里最好将token写入文件或者存入数据库,本文只讨论授权和上传逻辑,故不加入数据库的相关操作。 至此,百度网盘的授权操作就完成了。服务器本地文件上传至百度网盘根据官网文档描述:https://pan.baidu.com/union/doc/3ksg0s9ye,上传流程是指,用户将本地文件上传到百度网盘云端服务器的过程。文件上传分为三个阶段:预上传、分片上传、创建文件。第二个阶段分片上传依赖第一个阶段预上传的结果,第三个阶段创建文件依赖第一个阶段预上传和第二阶段分片上传的结果,串行完成这三个阶段任务后,本地文件成功上传到网盘服务器。 说白了,有点像HTTP连接的三次握手,目的就是为了保证上传数据的完整性,强制串行的原子操作也有利于保证上传任务的可靠性。 首先构建预上传函数:func (r *Bcloud) FileUploadSessionStart(req *FileUploadSessionStartReq) (*FileUploadSessionStartResp, error) { token, err := r.getAuthToken() if err != nil { return nil, err req.Method = "precreate" req.AccessToken = token req_, err := req.to() if err != nil { return nil, err resp := new(FileUploadSessionStartResp) err = r.requestURLEncode(http.MethodPost, "https://pan.baidu.com/rest/2.0/xpan/file", req_, resp) if err != nil { return nil, err } else if err := resp.Err(); err != nil { return nil, err if len(resp.BlockList) == 0 { resp.BlockList = []int64{0} return resp, nil }这里参数为预上传参数的结构体:type FileUploadSessionStartReq struct { Method string `query:"method"` AccessToken string `query:"access_token"` Path string `json:"path"` File io.Reader RType *int64 `json:"rtype"` }随后是分片上传逻辑:func (r *Bcloud) FileUploadSessionAppend(req *FileUploadSessionAppendReq) error { token, err := r.getAuthToken() if err != nil { return err req.Method = "upload" req.AccessToken = token req.Type = "tmpfile" resp := new(fileUploadSessionAppendResp) err = r.requestForm(http.MethodPost, "https://d.pcs.baidu.com/rest/2.0/pcs/superfile2", req, resp) if err != nil { return err } else if err := resp.Err(); err != nil { return err } else if resp.ErrorMsg != "" { return fmt.Errorf(resp.ErrorMsg) return nil type FileUploadSessionAppendReq struct { Method string `query:"method"` // 本接口固定为precreate AccessToken string `query:"access_token"` Type string `query:"type"` // 固定值 tmpfile Path string `query:"path"` // 需要与上一个阶段预上传precreate接口中的path保持一致 UploadID string `query:"uploadid"` // 上一个阶段预上传precreate接口下发的uploadid PartSeq int64 `query:"partseq"` // 
文件分片的位置序号,从0开始,参考上一个阶段预上传precreate接口返回的block_list File io.Reader `file:"file"` // 是 RequestBody参数 上传的文件内容 }对于总体积大于4mb的文件,通过切片的方式进行上传。 总后是合并文件写入文件逻辑:func (r *Bcloud) FileUploadSessionFinish(req *FileUploadSessionFinishReq) error { token, err := r.getAuthToken() if err != nil { return err req.Method = "create" req.AccessToken = token req_, err := req.to() if err != nil { return err resp := new(fileUploadSessionFinishResp) err = r.requestURLEncode(http.MethodPost, "https://pan.baidu.com/rest/2.0/xpan/file", req_, resp) if err != nil { return err } else if err := resp.Err(); err != nil { return err return nil type FileUploadSessionFinishReq struct { Method string `query:"method"` AccessToken string `query:"access_token"` Path string `json:"path"` File io.Reader `json:"-"` UploadID string `json:"uploadid"` RType *int64 `json:"rtype"` }至此,完成了文件上传的三个阶段:预上传、分片上传、创建文件。开源发布Publish我们知道在 Golang的项目中,可以 import 一个托管在远程仓库的模块,这个模块在我们使用 go get 的时候,会下载到本地。既然是放在远程仓库上,意味着所有人都可以发布,并且所以人也都可以使用,所以为了让乡亲们更方便地上传数据到百度网盘,让我们把这个项目开源。先在你的 Github 上新建一个仓库,记得选 Public(公开项目),随后将项目代码推送到Github上面:echo "# bdyp_upload_golang" >> README.md git init git add README.md git commit -m "first commit" git branch -M main git remote add origin https://github.com/zcxey2911/bdyp_upload_golang.git git push -u origin main在项目根目录使用go mod init 命令进行初始化,注意这里的模块名,填写我们的git仓库名称,但是不要带着.git:go mod init github.com/zcxey2911/bdyp_upload_golang再次推送项目模块代码:git add -A git commit -m "Add a go mod file" git push -u origin main全部完成以后,刷新我们的仓库,就可以看到我们的刚刚上传的项目代码了,点击 release 发布一个版本即可。 最后,通过go get命令安装发布之后的模块:go get github.com/zcxey2911/bdyp_upload_golang完整的调用流程:package main import ( "fmt" bdyp "github.com/zcxey2911/bdyp_upload_golang" func main() { var bcloud = bdyp.Bcloud{} // 获取token res, err := bcloud.GetToken("oob获取的code", "oob", "应用appkey", "应用appsecret") fmt.Println(res) if err != nil { fmt.Println("err", err) } else { fmt.Printf("接口的token是: %#v\n", res.AccessToken) // 读取文件 f, err := os.Open("/Users/liuyue/Downloads/ju1.webp") if err != 
nil {
		fmt.Println("err", err)
		return
	}
	defer f.Close()
	// 上传文件
	print(bcloud.Upload(&bdyp.FileUploadReq{
		Name:  "/apps/云盘备份/ju2.webp",
		File:  f,
		RType: nil,
	}))
}

查看上传的数据:简单快速,一气呵成。

结语

当然了,百度云盘备份也不是没有缺陷,将数据存储在云端可能会存在安全性和隐私性问题,与此同时,数据量很大或者数据分布在不同地点的情况下,恢复数据所需的时间会比较长。不差钱的同学也可以选择磁盘快照服务,最后奉上项目地址,与君共勉:https://github.com/zcxey2911/bdyp_upload_golang
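上文提到"总体积大于4MB的文件,通过切片的方式进行上传",而预上传(precreate)阶段需要提交由各个分片MD5组成的block_list,这个切片计算逻辑可以用一段Python示意(4MB分片大小来自官方文档,函数名为自拟):

```python
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # 百度网盘普通用户的分片上限为 4MB

def block_md5_list(data: bytes, chunk_size: int = CHUNK_SIZE):
    """将文件内容按 4MB 切片,返回每个分片的 MD5 列表,供 precreate 的 block_list 使用。"""
    return [
        hashlib.md5(data[i:i + chunk_size]).hexdigest()
        for i in range(0, len(data), chunk_size)
    ]

# 9MB 的内容会被切成 3 片(4MB + 4MB + 1MB)
print(len(block_md5_list(b"\x00" * (9 * 1024 * 1024))))  # 3
```

分片上传(superfile2)接口中的partseq,对应的正是这个列表里各分片的下标。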

前端已死?全栈当立?取法于中,仅得其下。

开篇明义,前端已死?根本就是扯淡。前端技术精微渊深,驳杂宽广,除了基础的 HTML、CSS 和 JavaScript 技术外,前端技术还涉及到许多其他相关技术和工具,比如前端框架、UI 库、自动化构建工具、代码管理工具等等。这些技术并没有死,反而生态圈愈发健壮,但为什么前端已死的论调甚嚣尘上?前端市场萎靡前端技术并未消亡,但前端工程师的坑位却逐年减少,为什么?是由于竞争加剧、市场饱和、经济衰退等多种因素导致的。每年都有海量的应届生进入市场,但是岗位就那么多,三年经济下行,不是短时间能够缓过来的,所以前端岗的HeadCount比往年少也是合乎逻辑的,再者说,所谓出来混,迟早都要还,从2015年开始,前端岗市场就是一片蓝海,大部分人都吃到了前端市场的红利,但谁也不能保证一直在风口,所以蓝海变红海,也符合市场规律。 此外,从技术层面来看,前端市场萎靡有下面几个原因:技术迭代快:前端技术在不断更新和迭代,新的技术和框架层出不穷。对于企业而言,要求前端工程师能够跟上技术的发展,并且具备不断学习和创新的能力,因此前端岗位的技术要求也会相应变高。 工具化、标准化:前端开发工具和标准化规范不断更新和完善,如Node.js、Webpack、ESLint、TypeScript等,前端工程师需要具备使用和运用这些工具和规范的能力。这也使得企业在招聘前端工程师时,更加注重前端工程师的技术基础和工具应用能力。 设计和交互要求提高:现在的前端开发要求不仅仅是实现静态页面和基本交互,更需要结合设计和交互,实现复杂的页面和动态效果。这对前端工程师的设计和交互能力也提出了更高的要求。 全栈工程师的兴起:全栈工程师是指具备前后端开发能力的工程师,他们不仅能够开发前端,还能够处理后端业务逻辑和数据库等技术。在一些公司中,他们更倾向于招聘全栈工程师,而非仅仅只招前端工程师,说白了,前后端分离项目,只招一个全栈的成本明显比招一个前端和一个后端的成本要低得多。 人工智能等新技术的涌现:随着人工智能、大数据、云计算等新技术的涌现,企业对前端工程师的需求也会发生变化。前端工程师不仅需要具备前端技术方面的能力,还需要了解其他相关技术,如机器学习、数据可视化等,ChatGPT的风靡也恰如其分的说明了这一点。其他行业类比一些岗位的衰退甚至消亡,都有其背后的深层次原因,类比的话,目前前端岗有点类似足球行业的古典前腰位置,古典前腰位置指的是足球比赛中的前腰球员,通常在球队阵容中处于前场中央位置,负责组织进攻和创造得分机会。这个位置在过去的足球比赛中非常重要,但随着现代足球的发展,它逐渐消失了。 其中的一个原因是足球比赛的战术和风格发生了改变。在过去,球队的阵容通常是4-4-2或者4-3-3这样的传统阵型,其中前腰球员有着非常重要的位置。然而,现代足球比赛中,球队更多的采用了4-2-3-1或者4-1-4-1这样的阵型,前腰球员的作用被更多地分摊到了其他球员身上。 另一个原因是现代足球比赛中球员的身体素质要求越来越高,运动员需要具备更好的体能、速度和耐力。随着比赛节奏的加快,球员需要更快地反应并更加活跃地在场上奔跑。这也意味着更多的球员需要参与到防守和进攻中,而前腰球员的作用也逐渐减少。 此外,现代足球比赛中的技术和战术变化也导致前腰球员的角色发生了改变。如今,球队更多地依靠侧翼球员和边后卫来制造得分机会,而前腰球员的作用则变得更加多样化,需要具备更全面的技术和战术素养。 比如曾经的世界杯金球奖获得者,哥伦比亚传奇前腰哈梅斯·罗德里格斯,江湖人称J罗,2014年巴西世界杯后,西甲豪门皇家马德里斥八千万欧元的巨资将其引进,一时风光无两,但今时今日,正值当打之年的J罗却混迹在欧洲末流的希腊球会,泯然众人矣。所以,足球比赛的发展和变化是导致古典前腰位置消亡的主要原因之一。虽然这个位置已经不再像过去那样重要,但是球员的多样化角色和更加全面的技能要求使得现代足球比赛更加具有挑战性和趣味性,同样地,如果想在前端岗位保持竞争力,就需要增加其他业务层面上的技能,或者展示出能够在业务上独挑大梁的多面手特性。如何破局虽然前端岗位减少,竞争加剧,但这并不是世界末日,除了前文提到的转型全栈工程师,变身行业多面手,作为前端工程师,也可以选择在前端这个技术栈上持续精进。 金庸先生的传世名作《神雕侠侣》中,有一段情节是杨过在深山中找到了一代剑魔独孤求败的“剑冢”,其中刻着这样一段话:剑魔独孤求败既无敌于天下,乃埋剑于斯。呜呼!群雄束手,长剑空利,不亦悲夫!独孤求败于此葬下了其一生所用的四把剑,其中第二柄为“衣冠冢”,只有描述而无实物。 事实上,剑冢所葬四柄剑,就代表了四个不同的前端技术阶段。 
第一把剑是一把青光闪闪的无名利剑:凌厉刚猛,无坚不摧,弱冠前以之与河朔群雄争锋。独孤求败弱冠之前所用的这把剑就和他的少年心性一般,年轻气盛,锐不可当,好勇斗狠,争强好胜,但自身技术还欠打磨,也就是我们刚刚入门前端的阶段,也许已经熟练掌握了某一个前端库,比如JQuery,但JQ却已经并不足以让我们竞聘上任何一个前端岗,所以,只能与河朔群雄争锋,而不是技盖群雄。第二把剑是久历江湖之后,在恶臭的职场浸染了以后,能否还能保持初心,即进入到了“修心”的境界:紫薇软剑,三十岁前所用,误伤义士不祥,乃弃之深谷。是的,不忘初心,追求技术的纯粹性,不会因为环境或者其他原因而轻易改变之前的那个少年。 第三把就是誉满全球的玄铁重剑:重剑无锋,大巧不工。四十岁前恃之横行天下。这是独孤求败四十岁之前所用的兵刃,天下已无抗手,无人能出其右。类比的话,作为前端工程师,我们已褪去了年轻时候的锋芒毕露,不再争论那个框架更好,而是将目光投入更底层的算法和数据结构。 第四把剑却是一把剑柄已经腐烂的木剑:四十岁后,不滞于物,草木竹石均可为剑。自此精修,渐进于无剑胜有剑之境。独孤求败从与人争胜变为了与己争胜,正在开辟一条没有人走过的剑道。是的,正如前端界的独孤求败:尤雨溪(Evan You),早已超凡入圣,研发出大道至简,重剑无锋的Vue.js框架之后,同样自此精修,渐进于无剑胜有剑之境,以前端技术傲睨一世,挟博纵辩,务欲胜人,所作亦颇博丽窈渺,声名甚著。试问,如果我们达到了“木剑”的境界,你还会在乎什么所谓的“前端已死”吗?前端死不死,Web亡不亡,都已经和你没有任何关系了,因为江湖上全部都是你的传说,你也将发出:“呜呼!群雄束手,长剑空利,不亦悲夫!”的慨叹。结语前端未死,前端技术仍在,市场凋敝,岗位要求变高。但那又如何呢,独孤前辈的事迹在激励着我们,与其悲鸣,不如精修,临渊羡鱼,不如退而结网,所谓技术,心有拘囿,便不能纯。 最后,用古人先贤的传世名句和诸位前端同僚共勉:前端犹如西山日,岗位终如草上霜,半世风流半世僧,看似无情胜有情。

旧酒换新瓶,新版M1/M2芯片Macos系统(Ventura)安装古早版本Python2.7(Python2.x)

向下兼容特性是软件开发系统的一个重要指标,它是指一个新的系统或者软件能够与旧的系统或软件兼容并正常运行。这意味着旧系统或软件可以在新系统或软件中使用,而不会出现问题。向下兼容对于提高软件或系统的可用性非常重要,因为它允许用户在不更换旧系统或软件的情况下使用新系统或软件。 我们知道MacOS系统从Monterey12.3版本起就移除了系统内置的Python2,更不消说最新的Ventura13.1了,但有时候我们依然需要古早版本的Python2.x来维护或者更新一些“祖传项目”,不得不承认,这类低版本的“祖传项目”在各种中大型企业内可谓是层出不穷,那么在最新的Ventura13.1系统中,就得重新安装Python2版本。Docker构建Python2最简单的方式是通过Docker镜像来构建Python2开发环境,通过使用容器,开发者可以轻松地将开发环境与应用程序隔离开来,这有助于避免依赖冲突和版本混乱。 直接拉取Python2的Docker镜像文件:docker pull python:2.7.18-slim-stretch随后运行进入Python2命令行:docker run -it --name python2 python:2.7.18-slim-stretch程序返回:➜ qiniu_async docker run -it --name python2 python:2.7.18-slim-stretch Python 2.7.18 (default, Apr 20 2020, 20:08:54) [GCC 6.3.0 20170516] on linux2 Type "help", "copyright", "credits" or "license" for more information. >>>当然了,构建开发环境并非只需要开启命令行,还需要通过pip安装一些古早版本的三方依赖,此时退出Python2命令行:exit()随后查看Python2的容器id➜ ~ docker ps CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES 41ef4af5169d python:2.7.18-slim-stretch "python2" 8 minutes ago Up 5 minutes python2 ➜ ~进入容器内终端:docker exec -it 41ef4af5169d /bin/sh此时,就可是使用pip命令来安装一些老版本的软件了,比如说Django:pip install django@1.11.29程序返回:DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support Collecting django Downloading Django-1.11.29-py2.py3-none-any.whl (6.9 MB) |████████████████████████████████| 6.9 MB 8.9 MB/s Collecting pytz Downloading pytz-2022.7-py2.py3-none-any.whl (499 kB) |████████████████████████████████| 499 kB 20.7 MB/s Installing collected packages: pytz, django Successfully installed django-1.11.29 pytz-2022.7 WARNING: You are using pip version 20.0.2; however, version 20.3.4 is available. You should consider upgrading via the '/usr/local/bin/python -m pip install --upgrade pip' command. 
# pip list DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support Package Version ---------- ------- Django 1.11.29 pip 20.0.2 pytz 2022.7 setuptools 44.1.0 wheel 0.34.2安装成功后,退出容器,然后提交更改:# exit ➜ qiniu_async docker commit 41ef python:2.7.18-slim-stretch sha256:119b30be68c806bdd4c74ffa3da115ba6ab144a91664a13e728c529c1fd5bca8如此,就算容器被销毁,再次通过镜像启动容器,也可以保留安装的老版本软件。HomeBrew安装虚拟环境构建Python2如果不想使用Docker,也可以考虑Python虚拟环境,它是在主机操作系统之上的一个独立的目录,其中包含一套完全独立的Python解释器和一组库和软件包。它可以在不影响其他项目的情况下,为单个项目创建一套特定的运行环境。 首先安装HomeBrew:/bin/zsh -c "$(curl -fsSLhttps://gitee.com/cunkai/HomebrewCN/raw/master/Homebrew.sh)”随后执行清理和升级:brew cleanup && brew update安装虚拟环境:brew install pyenv随后安装需要的Python2版本:pyenv install 2.7.18接着添加环境变量:echo 'PATH=$(pyenv root)/shims:$PATH' >> ~/.zshrc之后就可以开启虚拟环境了:pyenv init注意重启终端后,就可以切换Python版本了:pyenv shell 2.7.18结语Docker容器和Python虚拟环境都可以让MacOs系统做到向下兼容开发和维护古早项目,虚拟环境主要用于在同一台机器上管理多个Python项目的依赖关系,而Docker容器则更适用于在不同的机器之间迁移应用程序和环境,可以在任何支持Docker的机器上运行该容器,而无需考虑底层操作系统的差异。
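如果不想每次手动进入容器pip安装再commit,也可以用一个极简的Dockerfile把环境固化下来(镜像标签与依赖版本沿用正文示例,仅作示意):

```dockerfile
FROM python:2.7.18-slim-stretch
# 固化正文中安装的老版本依赖
RUN pip install django==1.11.29
CMD ["python2"]
```

如此一来,docker build出的镜像天然携带依赖,团队成员也可以直接复用同一份环境定义。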

新版以太坊Ethereum库ethersV5.0配合后端Golang1.18实时链接区块链钱包(Metamask/Okc)以及验签操作

区块链去中心化思想无处不在,比如最近使用个体抗原自检替代大规模的中心化核酸检测,就是去中心化思想的落地实践,避免了大规模聚集导致的交叉感染,提高了检测效率,本次我们使用Ethereum最新的ethersV5.0以上版本链接去中心化区块链钱包,并且通过后端Golang1.18服务进行验签。在之前的一篇文章:青山不遮,毕竟东流,集成Web3.0身份钱包MetaMask以太坊一键登录(Tornado6+Vue.js3)中,我们使用的是ethersV4.0版本链接Metamask钱包,后端使用基于Python3.10的Tornado6.0框架,为了避免同质化,这里换成Okc钱包,客户端插件安装地址:https://chrome.google.com/webstore/detail/okx-wallet/mcohilncbfahbmgdjkbpemcciiolgcge前端链接浏览器钱包首先卸载Vue2.0项目:npm uninstall vue-cli -g这里node版本要在8.9以上,npm版本要在6以上;随后安装Vue3.0以上版本:npm install -g @vue/cli然后安装pnpm:npm install -g pnpmpnpm解决了传统npm的node\_modules依赖困境,主要通过软链接和硬链接的结合使用,最终达到节省磁盘空间,安装速度快,严格高效等目的,这里推荐使用pnpm进行包管理。接着,在当前项目中安装ethers库:pnpm install ethers@5.7.2 --save注意这里版本要求v5.0以上。根据ethers5.4官方文档所述:https://docs.ethers.io/v5/getting-started/#getting-started--connecting-rpcethers5.0版本支持异步async操作,提高了效率,async函数就是使用async关键字声明的函数。它是 AsyncFunction 构造函数的实例,并且其中允许使用 await 关键字。async 和 await 关键字让我们可以用一种更简洁的方式写出基于 Promise 的异步行为,而无需刻意地链式调用 promise。声明异步链接方法://链接逻辑 connect:async function(){ },随后请求链接当前的区块链钱包,并且异步获取公钥地址:const provider = new ethers.providers.Web3Provider(window.ethereum); const accounts = await provider.send("eth_requestAccounts", []);打印钱包地址:console.log(accounts);如图所示:这里已经打印出了okc钱包的公钥地址,随后生成签名:const signer = provider.getSigner(); var rightnow = (Date.now()/1000).toFixed(0) console.log(rightnow); signer.signMessage("Signing in at "+rightnow) .then((signature) => { //打印签名和公钥 console.log(accounts[0],signature); });这里通过provider对象获取签名者对象signer,接着调用signMessage方法来进行签名操作,加签算法采用最简单的字符串+时间戳的形式。前端返回签名和公钥地址:0x5cae6c39a56d99d68e7a20c76da0ec387e34249b 0x1093b6dc7c6ae1340b2ebcf819dac1a7160b69a2abbb14d86a0696bd96d6b36923d5f3f82588f30a9353b327014338f51d4e7a90baa8052791a8017f156b57511c后端Golang验签验签的目的很好理解,如果在链接钱包的一瞬间,客户端被监听的其他软件恶意篡改公钥地址,那么很可能会给客户造成不可挽回的经济损失,所以暴露在前端的一切数据都需要后端进行校验,之前我们采用的是Python3.10版本进行验签操作:from web3.auto import w3 from eth_account.messages import defunct_hash_message import time public_address = "0x5cae6c39a56d99d68e7a20c76da0ec387e34249b" signature = 
"0xc7b06789e6710652d8540487055e0e75918c9c4366ec47c9e7008760df1dedd6506a908f466e448481afed3fe009bbdbfdfa16c28585eff68be54d600083d4251b" #rightnow = int(time.time()) rightnow = 1670142219 print(rightnow) original_message = 'Signing in at {}'.format(rightnow) message_hash = defunct_hash_message(text=original_message) signer = w3.eth.account.recoverHash(message_hash, signature=signature) print(signer)程序返回:1670142219 0x5cAE6c39A56d99d68e7A20c76da0ec387e34249b这里通过签名反向解析出了公钥地址,并且和前端获取的地址保持一致。下面我们采用Golang1.18版本来验签,看看有什么不一样,首先安装Golang1.18,请移步:兔起鹘落全端涵盖,Go lang1.18入门精炼教程,由白丁入鸿儒,全平台(Sublime 4)Go lang开发环境搭建EP00随后安装基于Golang的Ethereum库:go get github.com/storyicon/sigverify根据官方文档指引:https://github.com/storyicon/sigverify构建main.go文件:package main import ( "fmt" ethcommon "github.com/ethereum/go-ethereum/common" "github.com/storyicon/sigverify" func main() { valid, err := sigverify.VerifyEllipticCurveHexSignatureEx( ethcommon.HexToAddress("0x5cae6c39a56d99d68e7a20c76da0ec387e34249b"), []byte("Signing in at 1670142219"), "0xc7b06789e6710652d8540487055e0e75918c9c4366ec47c9e7008760df1dedd6506a908f466e448481afed3fe009bbdbfdfa16c28585eff68be54d600083d4251b", fmt.Println(valid, err) // true <nil> }这里sigverify.VerifyEllipticCurveHexSignatureEx方法有三个参数,分别是公钥地址,签名字符集以及前端返回的签名字符串,返回值为valid:➜ mydemo git:(master) ✗ go run "/Users/liuyue/wodfan/work/mydemo/src/mytest.go" true <nil>如果验签通过会返回布尔值:true。至此,后端验签流程就结束了。结语总体而言,前端Ethers采用了ES7新语法async/await实现了重大改进,它提供了一种使用同步代码样式异步链接钱包对象的方式,而且不会阻塞主线程,而后端Golang作为编译型语言验签流程反而比解释型的Python更加简单方便。
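此外,由于签名内容是"Signing in at "加时间戳,后端验签通过之后,还应当校验时间戳的新鲜度,防止签名被截获后重放。下面是一段纯Python的示意逻辑(300秒的时间窗口是假设值,可按业务调整):

```python
import time

def is_fresh(signed_ts: int, window: int = 300) -> bool:
    """签名时间戳与服务器当前时间相差超过 window 秒即视为过期,拒绝登录。"""
    return abs(time.time() - signed_ts) <= window

print(is_fresh(int(time.time())))  # 刚生成的签名:True
print(is_fresh(1670142219))        # 早已过期的历史时间戳:False
```

验签与新鲜度校验双管齐下,才能同时防住公钥篡改与签名重放两类攻击。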

含辞未吐,声若幽兰,史上最强免费人工智能AI语音合成TTS服务微软Azure(Python3.10接入)

所谓文无第一,武无第二,云原生人工智能技术目前呈现三足鼎立的态势,微软,谷歌以及亚马逊三大巨头各擅胜场,不分伯仲,但目前微软Azure平台不仅仅只是一个PaaS平台,相比AWS,以及GAE,它应该是目前提供云计算人工智能服务最全面的一个平台,尤其是语音合成领域,论AI语音的平顺、自然以及拟真性,无平台能出其右。本次,我们通过Python3.10版本接入Azure平台语音合成接口,打造一款本地的TTS服务(文本转语音:Text To Speech)。准备工作首先根据Azure平台官方文档:https://learn.microsoft.com/zh-cn/azure/cognitive-services/speech-service/get-started-text-to-speech?tabs=macos%2Cterminal&pivots=programming-language-python在平台上创建免费订阅服务:https://azure.microsoft.com/zh-cn/free/cognitive-services/免费订阅成功后,进入资源创建环节,这里我们访问网址,创建免费的语音资源:https://portal.azure.com/#create/Microsoft.CognitiveServicesSpeechServices这里注意订阅选择免费试用,使用区域选择东亚,如果在国外可以选择国外的对应区域。创建语音服务资源成功后,转到资源组列表,点击获取资源秘钥:需要注意的是,任何时候都不要将秘钥进行传播,或者将秘钥写入代码并且提交版本。这里相对稳妥的方式是将秘钥写入本地系统的环境变量中。Windows系统使用如下命令:setx COGNITIVE_SERVICE_KEY 您的秘钥Linux系统使用如下命令:export COGNITIVE_SERVICE_KEY=您的秘钥Mac系统的bash终端:编辑 ~/.bash\_profile,然后添加环境变量export COGNITIVE_SERVICE_KEY=您的秘钥添加环境变量后,请从控制台窗口运行 source ~/.bash\_profile,使更改生效。Mac系统的zsh终端:编辑 ~/.zshrc,然后添加环境变量export COGNITIVE_SERVICE_KEY=您的秘钥如此,前期准备工作就完成了。本地接入确保本地Python环境版本3.10以上,然后安装Azure平台sdk:pip3 install azure-cognitiveservices-speech创建test.py文件:`import azure.cognitiveservices.speech as speechsdk import os speech_config = speechsdk.SpeechConfig(subscription=os.environ.get('KEY'), region="eastasia")``audio_config = speechsdk.audio.AudioOutputConfig(use_default_speaker=True)`这里定义语音的配置文件,通过os模块将上文环境变量中的秘钥取出使用,region就是新建语音资源时选择的地区,audio\_config是选择当前计算机默认的音箱进行输出操作。接着,根据官方文档的配置,选择一个语音机器人:https://learn.microsoft.com/zh-cn/azure/cognitive-services/speech-service/language-support?tabs=stt-tts#prebuilt-neural-voices 纯文本 wuu-CN-XiaotongNeural1(女) wuu-CN-YunzheNeural1(男) 不支持 yue-CN 中文(粤语,简体) yue-CN 纯文本 yue-CN-XiaoMinNeural1(女) yue-CN-YunSongNeural1(男) 不支持 zh-CN 中文(普通话,简体) zh-CN 音频 + 人工标记的脚本 结构化文本 短语列表 zh-CN-XiaochenNeural4、5、6(女) zh-CN-XiaohanNeural2、4、5、6(女) zh-CN-XiaomengNeural1、2、4、5、6(女) zh-CN-XiaomoNeural2、3、4、5、6(女) zh-CN-XiaoqiuNeural4、5、6(女) zh-CN-XiaoruiNeural2、4、5、6(女) zh-CN-XiaoshuangNeural2、4、5、6、8(女) 
zh-CN-XiaoxiaoNeural2、4、5、6(女) zh-CN-XiaoxuanNeural2、3、4、5、6(女) zh-CN-XiaoyanNeural4、5、6(女) zh-CN-XiaoyiNeural1、2、4、5、6(女) zh-CN-XiaoyouNeural4、5、6、8(女) zh-CN-XiaozhenNeural1、2、4、5、6(女) zh-CN-YunfengNeural1、2、4、5、6(男) zh-CN-YunhaoNeural1、2、4、5、6(男) zh-CN-YunjianNeural1、2、4、5、6(男) zh-CN-YunxiaNeural1、2、4、5、6(男) zh-CN-YunxiNeural2、3、4、5、6(男) zh-CN-YunyangNeural2、4、5、6(男) zh-CN-YunyeNeural2、3、4、5、6(男) zh-CN-YunzeNeural1、2、3、4、5、6(男) 神经网络定制声音专业版 神经网络定制声音精简版(预览版) 跨语言语音(预览版) zh-CN-henan 中文(中原河南普通话,中国大陆) 不支持 不支持 zh-CN-henan-YundengNeural1(男) 不支持 zh-CN-liaoning 中文(东北普通话,中国大陆) 不支持 不支持 zh-CN-liaoning-XiaobeiNeural1(女) 不支持 zh-CN-shaanxi 中文(中原陕西普通话,中国大陆) 不支持 不支持 zh-CN-shaanxi-XiaoniNeural1(女) 不支持 zh-CN-shandong 中文(冀鲁普通话,中国大陆) 不支持 不支持 zh-CN-shandong-YunxiangNeural1(男) 不支持 zh-CN-sichuan 中文(西南普通话,简体) zh-CN-sichuan 纯文本 zh-CN-sichuan-YunxiNeural1(男) 不支持 zh-HK 中文(粤语,繁体) zh-HK 纯文本 zh-HK-HiuGaaiNeural4、5、6(女) zh-HK-HiuMaanNeural4、5、6(女) zh-HK-WanLungNeural1、4、5、6(男) 神经网络定制声音专业版 zh-TW 中文(台湾普通话) zh-TW 纯文本 zh-TW-HsiaoChenNeural4、5、6(女) zh-TW-HsiaoYuNeural4、5、6(女) zh-TW-YunJheNeural4、5、6(男) 神经网络定制声音专业版单以中文语音论,可选择的范围还是相当广泛的。继续编辑代码:import azure.cognitiveservices.speech as speechsdk import os speech_config = speechsdk.SpeechConfig(subscription=os.environ.get('KEY'), region="eastasia") audio_config = speechsdk.audio.AudioOutputConfig(use_default_speaker=True) speech_config.speech_synthesis_voice_name='zh-CN-XiaomoNeural' speech_synthesizer = speechsdk.SpeechSynthesizer(speech_config=speech_config, audio_config=audio_config) text = "hello 大家好,这里是人工智能AI机器人在说话" speech_synthesis_result = speech_synthesizer.speak_text_async(text).get()这里我们选择zh-CN-XiaomoNeural作为默认AI语音,并且将text文本变量中的内容通过音箱进行输出。如果愿意,我们也可以将语音输出为实体文件进行存储: import azure.cognitiveservices.speech as speechsdk import os speech_config = speechsdk.SpeechConfig(subscription=os.environ.get('KEY'), region="eastasia") audio_config = speechsdk.audio.AudioOutputConfig(use_default_speaker=True) file_config = 
speechsdk.audio.AudioOutputConfig(filename="./output.wav") speech_config.speech_synthesis_voice_name='zh-CN-XiaomoNeural' speech_synthesizer = speechsdk.SpeechSynthesizer(speech_config=speech_config, audio_config=file_config) text = "hello 大家好,这里是人工智能AI机器人在说话" speech_synthesis_result = speech_synthesizer.speak_text_async(text).get()这里指定file\_config配置为脚本相对路径下的output.wav文件:ls output.wav如此,音频文件就可以被保存起来,留作以后使用了。语音调优默认AI语音听多了,难免会有些索然寡味之感,幸运的是,Azure平台提供了语音合成标记语言 (SSML) ,它可以改善合成语音的听感。根据Azure官方文档:https://learn.microsoft.com/zh-cn/azure/cognitive-services/speech-service/speech-synthesis-markup通过调整语音的角色以及样式来获取定制化的声音:语音 样式 角色 en-GB-RyanNeural1 cheerful, chat 不支持 en-GB-SoniaNeural1 cheerful, sad 不支持 en-US-AriaNeural chat, customerservice, narration-professional, newscast-casual, newscast-formal, cheerful, empathetic, angry, sad, excited, friendly, terrified, shouting, unfriendly, whispering, hopeful 不支持 en-US-DavisNeural chat, angry, cheerful, excited, friendly, hopeful, sad, shouting, terrified, unfriendly, whispering 不支持 en-US-GuyNeural newscast, angry, cheerful, sad, excited, friendly, terrified, shouting, unfriendly, whispering, hopeful 不支持 en-US-JaneNeural angry, cheerful, excited, friendly, hopeful, sad, shouting, terrified, unfriendly, whispering 不支持 en-US-JasonNeural angry, cheerful, excited, friendly, hopeful, sad, shouting, terrified, unfriendly, whispering 不支持 en-US-JennyNeural assistant, chat, customerservice, newscast, angry, cheerful, sad, excited, friendly, terrified, shouting, unfriendly, whispering, hopeful 不支持 en-US-NancyNeural angry, cheerful, excited, friendly, hopeful, sad, shouting, terrified, unfriendly, whispering 不支持 en-US-SaraNeural angry, cheerful, excited, friendly, hopeful, sad, shouting, terrified, unfriendly, whispering 不支持 en-US-TonyNeural angry, cheerful, excited, friendly, hopeful, sad, shouting, terrified, unfriendly, whispering 不支持 es-MX-JorgeNeural1 cheerful, chat 不支持 fr-FR-DeniseNeural1 cheerful, sad 不支持 fr-FR-HenriNeural1 cheerful, sad 
不支持 it-IT-IsabellaNeural1 cheerful, chat 不支持 ja-JP-NanamiNeural chat, customerservice, cheerful 不支持 pt-BR-FranciscaNeural calm 不支持 zh-CN-XiaohanNeural5 calm, fearful, cheerful, disgruntled, serious, angry, sad, gentle, affectionate, embarrassed 不支持 zh-CN-XiaomengNeural1、5 chat 不支持 zh-CN-XiaomoNeural5 embarrassed, calm, fearful, cheerful, disgruntled, serious, angry, sad, depressed, affectionate, gentle, envious YoungAdultFemale, YoungAdultMale, OlderAdultFemale, OlderAdultMale, SeniorFemale, SeniorMale, Girl, Boy zh-CN-XiaoruiNeural5 calm, fearful, angry, sad 不支持 zh-CN-XiaoshuangNeural5 chat 不支持 zh-CN-XiaoxiaoNeural5 assistant, chat, customerservice, newscast, affectionate, angry, calm, cheerful, disgruntled, fearful, gentle, lyrical, sad, serious, poetry-reading 不支持 zh-CN-XiaoxuanNeural5 calm, fearful, cheerful, disgruntled, serious, angry, gentle, depressed YoungAdultFemale, YoungAdultMale, OlderAdultFemale, OlderAdultMale, SeniorFemale, SeniorMale, Girl, Boy zh-CN-XiaoyiNeural1、5 angry, disgruntled, affectionate, cheerful, fearful, sad, embarrassed, serious, gentle 不支持 zh-CN-XiaozhenNeural1、5 angry, disgruntled, cheerful, fearful, sad, serious 不支持 zh-CN-YunfengNeural1、5 angry, disgruntled, cheerful, fearful, sad, serious, depressed 不支持 zh-CN-YunhaoNeural1、2、5 advertisement-upbeat 不支持 zh-CN-YunjianNeural1、3、4、5 Narration-relaxed, Sports_commentary, Sports_commentary_excited 不支持 zh-CN-YunxiaNeural1、5 calm, fearful, cheerful, angry, sad 不支持 zh-CN-YunxiNeural5 narration-relaxed, embarrassed, fearful, cheerful, disgruntled, serious, angry, sad, depressed, chat, assistant, newscast Narrator, YoungAdultMale, Boy zh-CN-YunyangNeural5 customerservice, narration-professional, newscast-casual 不支持 zh-CN-YunyeNeural5 embarrassed, calm, fearful, cheerful, disgruntled, serious, angry, sad YoungAdultFemale, YoungAdultMale, OlderAdultFemale, OlderAdultMale, SeniorFemale, SeniorMale, Girl, Boy zh-CN-YunzeNeural1、5 calm, fearful, cheerful, disgruntled, serious, angry, sad, 
depressed, documentary-narration OlderAdultMale, SeniorMale这里将语音文本改造为SSML的配置格式:import os import azure.cognitiveservices.speech as speechsdk speech_config = speechsdk.SpeechConfig(subscription=os.environ.get('KEY'), region="eastasia") audio_config = speechsdk.audio.AudioOutputConfig(use_default_speaker=True) file_config = speechsdk.audio.AudioOutputConfig(filename="./output.wav") speech_config.speech_synthesis_voice_name='zh-CN-XiaomoNeural' speech_synthesizer = speechsdk.SpeechSynthesizer(speech_config=speech_config, audio_config=file_config) #text = "hello 大家好,这里是人工智能AI机器人在说话" #speech_synthesis_result = speech_synthesizer.speak_text_async(text).get() text = """ <speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis" xmlns:mstts="https://www.w3.org/2001/mstts" xml:lang="zh-CN"> <voice name="zh-CN-XiaoxiaoNeural"> <mstts:express-as style="lyrical" role="YoungAdultFemale" > <prosody rate="+12.00%"> hello 大家好,这里是刘悦的技术博客 大江东去,浪淘尽,千古风流人物。 故垒西边,人道是,三国周郎赤壁。 乱石穿空,惊涛拍岸,卷起千堆雪。 江山如画,一时多少豪杰。 </prosody> </mstts:express-as> </voice> </speak>""" result = speech_synthesizer.speak_ssml_async(ssml=text).get()通过使用style和role标记进行定制,同时使用rate属性来提升百分之十二的语速,从而让AI语音更加连贯顺畅。注意这里使用ssml=text来声明ssml格式的文本。结语人工智能AI语音系统完成了人工智能在语音合成这个细分市场的落地应用,为互联网领域内许多需要配音的业务节约了成本和时间。
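如果需要频繁切换角色、样式和语速,可以把SSML的拼装封装成一个小函数,避免每次手写XML出错(以下为自拟的示意封装,style/role/rate的合法取值以官方文档中各语音的支持列表为准):

```python
def build_ssml(text, voice="zh-CN-XiaoxiaoNeural", style="lyrical",
               role="YoungAdultFemale", rate="+12.00%"):
    """按正文示例的 SSML 结构拼装文本,便于复用不同的角色与样式组合。"""
    return (
        '<speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis" '
        'xmlns:mstts="https://www.w3.org/2001/mstts" xml:lang="zh-CN">'
        f'<voice name="{voice}">'
        f'<mstts:express-as style="{style}" role="{role}">'
        f'<prosody rate="{rate}">{text}</prosody>'
        '</mstts:express-as></voice></speak>'
    )

print(build_ssml("大江东去,浪淘尽,千古风流人物。").startswith("<speak"))  # True
```

拼装好的字符串直接传给speak_ssml_async(ssml=...)即可。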

MacOs13 Ventura(M1/M2芯片) + Parallels Desktop 18(PD18史上最强虚拟机)永久免费使用攻略

众神殿内,高朋满座,胜友如云,Vmware、VirtualBox、Utm等虚拟机大神群英荟萃,只见位于C位王座上的Parallels怅惘抬头,缓缓逡巡,睥睨群小,目光到处,无人敢抬头对视。是的,如果说虚拟机领域有一位王者,非Parallels不能领袖群伦,毕竟大厂背书,功能满格,美中不足之处就是价格略高,但这也并非是Parallels的错,因为市场上没有任何一款虚拟机产品在产品力层面能和Parallels抗衡,本次我们在最新的MacOs13 Ventura(M1/M2芯片)系统下永久使用Parallels Desktop 18.1.0版本。首先升级最新的MacOs 13 Ventura 13.01系统:随后去Parallels官网下载18.1正式版:https://download.parallels.com/desktop/v18/18.1.0-53311/ParallelsDesktop-18.1.0-53311.dmg随后双击进行安装,安装成功之后,设置取消自动更新:随后,参考somebasj大神的永久使用方案:https://git.icrack.day/somebasj/ParallelsDesktopCrack终端运行命令:git clone https://git.icrack.day/somebasj/ParallelsDesktopCrack.git进入到目录:ParallelsDesktopCrackcd ParallelsDesktopCrack接着运行命令:chmod +x ./install.sh && sudo ./install.sh系统返回:➜ ~ cd /Users/liuyue/Downloads/parallelsdesktopcrack ➜ parallelsdesktopcrack chmod +x ./install.sh && sudo ./install.sh Password: [*] Copy prl_disp_service [*] Sign prl_disp_service /Applications/Parallels Desktop.app/Contents/MacOS/Parallels Service.app/Contents/MacOS/prl_disp_service: replacing existing signature [*] Copy fake licenses.json [*] Start Parallels Service [*] Exit Parallels Desktop account ... [*] Disable CEP ... 
[*] Crack success.至此,即可拥有Parallels Desktop 18的永久使用权:需要注意的是,Parallels Desktop有一定几率监听客户端的用户行为,安全起见,最后修改本地的host文件避免远程监听:sudo vim /etc/hosts添加如下规则:127.0.0.1 download.parallels.com 127.0.0.1 update.parallels.com 127.0.0.1 desktop.parallels.com 127.0.0.1 download.parallels.com.cdn.cloudflare.net 127.0.0.1 update.parallels.com.cdn.cloudflare.net 127.0.0.1 desktop.parallels.com.cdn.cloudflare.net 127.0.0.1 www.parallels.cn 127.0.0.1 www.parallels.com 127.0.0.1 www.parallels.de 127.0.0.1 www.parallels.es 127.0.0.1 www.parallels.fr 127.0.0.1 www.parallels.nl 127.0.0.1 www.parallels.pt 127.0.0.1 www.parallels.ru 127.0.0.1 www.parallelskorea.com 127.0.0.1 reportus.parallels.com 127.0.0.1 parallels.cn 127.0.0.1 parallels.com 127.0.0.1 parallels.de 127.0.0.1 parallels.es 127.0.0.1 parallels.fr 127.0.0.1 parallels.nl 127.0.0.1 parallels.pt 127.0.0.1 parallels.ru 127.0.0.1 parallelskorea.com 127.0.0.1 pax-manager.myparallels.com 127.0.0.1 myparallels.com 127.0.0.1 my.parallels.com保存后,锁定host文件:sudo chflags uchg /etc/hosts #锁定Hosts文件,只读 sudo chflags nouchg /etc/hosts #解锁Hosts文件,读写接着就可以安装自己喜欢的虚拟机系统了,Win11的话,推荐在微软官网进行下载:https://www.microsoft.com/software-download/windows11结语所谓道高一尺,魔高一丈,服务端如果想制止这种永久免费使用行为,就需要更新小版本,而客户端只需不更新即可持续地进行永久使用。

君子不玩物丧志,亦常以借物调心,网站集成二次元网页小组件(widget)石蒜模拟器,聊以赏玩

The classic Caigentan (《菜根谭》) says: "Wandering among mountain forests, springs, and rocks, worldly thoughts gradually subside; lingering over poetry, books, and paintings, vulgar airs quietly dissolve. So although the gentleman never loses his purpose in playthings, he often borrows objects to attune his mind." In other words, roaming among forests and springs clears away distracting thoughts, and dwelling on poetry and painting washes off vulgar habits. A gentleman does not let trinkets derail his ambition, yet he still reaches for elegant little objects to regulate his mood. A 2-D anime web widget is exactly that kind of object: functionally unremarkable, but good for a smile.

According to the official documentation (https://github.com/dsrkafuu/sakana-widget), there are two ways to include the widget. The first is the template include:

<div id="sakana-widget"></div>
<script>
  function initSakanaWidget() {
    new SakanaWidget().mount('#sakana-widget');
  }
</script>
<script
  async
  onload="initSakanaWidget()"
  src="https://cdn.jsdelivr.net/npm/sakana-widget@2.3.1/lib/sakana.min.js"
></script>

Here we declare a div container, load the component library sakana.min.js asynchronously (async) from the external CDN, and then mount the widget onto the div container via the initSakanaWidget callback.

The second way is installation as an NPM package:

npm install --save sakana-widget

followed by an import inside your own component:

import SakanaWidget from 'sakana-widget';

new SakanaWidget().mount('#sakana-widget');

The package's default export is the class SakanaWidget, which is used to initialize a widget. Here a widget with default settings is created and, again, mounted onto the div container.

The default widget looks the same everywhere, so we can customize it:

<div id="sakana-widget" style="position:fixed;right:0px;bottom:0px;"></div>
<script>
  function initSakanaWidget() {
    const takina = SakanaWidget.getCharacter('takina');
    takina.initialState = {
      ...takina.initialState,
      i: 0.001,
      d: 1,
    };
    SakanaWidget.registerCharacter('takina-slow', takina);
    new SakanaWidget({ character: 'takina-slow', autoFit: true }).mount('#sakana-widget');
  }
</script>
<script
  async
  onload="initSakanaWidget()"
  src="https://cdn.jsdelivr.net/npm/sakana-widget@2.3.1/lib/sakana.min.js"
></script>

Here the div container is first pinned to the bottom-right corner of the screen with fixed positioning (position:fixed). Fixed positioning is easy to recognize: however long the page is and however the scrollbar moves, a fixed container stays still relative to the browser window, detached from the document flow, floating unmoved in the corner. We then fetch a built-in character object with the static method getCharacter, tweak its parameters to create an ultra-slow, undamped (perpetual) Takina as a new character, and register that character object with registerCharacter. The parameters i and d stand for inertia and damping (decay) respectively; the other state fields are:

export interface SakanaWidgetState {
  i: number;
  s: number;
  d: number;
  r: number;
  y: number;
  /** vertical speed */
  t: number;
  /** horizontal speed */
  w: number;
}

export interface SakanaWidgetCharacter {
  image: string;
  initialState: SakanaWidgetState;
}

The constructor options of a widget instance:

export interface SakanaWidgetOptions {
  /** widget size, default `200` */
  size?: number;
  /** auto fit container size (min 120px), default `false` */
  autoFit?: boolean;
  /** character, default `chisato` */
  character?: 'chisato' | 'takina';
  /** controls bar, default `true` */
  controls?: boolean;
  /** canvas stroke settings, default `#b4b4b4` & `10` */
  stroke?: {
    color?: string;
    width?: number;
  };
  /** animation-stop threshold, default `0.1` */
  threshold?: number;
  /** rotation angle, default `0` */
  rotate?: number;
}

The effect is as shown in the figure.

Besides the built-in characters, we can also swap in our own skin for a character:

<div id="sakana-widget" style="position:fixed;right:0px;bottom:0px;"></div>
<script>
  function initSakanaWidget() {
    const demo = SakanaWidget.getCharacter('chisato');
    demo.initialState = {
      ...demo.initialState,
      i: 0.001,
      d: 1,
    };
    demo.image = 'https://p3-passport.byteimg.com/img/user-avatar/f2190ea080335c49be09e4072617ea89~100x100.awebp';
    SakanaWidget.registerCharacter('demo', demo);
    new SakanaWidget({ character: 'demo', autoFit: true }).mount('#sakana-widget');
  }
</script>
<script
  async
  onload="initSakanaWidget()"
  src="https://cdn.jsdelivr.net/npm/sakana-widget@2.3.1/lib/sakana.min.js"
></script>

Here, taking this site's logo as an example, we replace the character's skin by setting the image property; the image format is not restricted. Besides an online image URL, base64-encoded images are also supported. First obtain the base64 encoding of a local image file with a Golang script:

package main

import (
	"encoding/base64"
	"fmt"
	"io/ioutil"
	"log"
)

func main() {
	srcByte, err := ioutil.ReadFile(`/Users/liuyue/logo_pink.png`)
	if err != nil {
		log.Fatal(err)
	}
	res := base64.StdEncoding.EncodeToString(srcByte)
	fmt.Println(res)
}

The program returns:

iVBORw0KGgoAAAANSUhEUgAAAyAAAAHDCAYAAADcP35t... (base64 output truncated)
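The same base64 step can also be done without leaving the JavaScript toolchain. A minimal Node.js sketch, assuming you would normally read a real PNG with fs.readFileSync; here a stand-in buffer holding the four PNG magic bytes keeps the example self-contained, and the result is wrapped as a data URI that can be assigned to the character's image property:

```javascript
// Build a data URI for demo.image using Node's built-in Buffer.
// In practice: const png = require('fs').readFileSync('logo_pink.png');
// Stand-in data (the 4 PNG magic bytes) keeps the sketch runnable as-is.
const png = Buffer.from([0x89, 0x50, 0x4e, 0x47]);
const dataUri = `data:image/png;base64,${png.toString('base64')}`;
console.log(dataUri); // data:image/png;base64,iVBORw==
```

Any string produced this way can be set on demo.image just like the online .awebp URL above, with no external hosting needed.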
g/ocbD1NQgOXoc8cbNQGXE77gQEV3FGIAQ0cUlKSstjTCTW3yQkKSXfeUKc9+Anw0SnkfqVUVa+0p9w1xt8YtRqS2p0YWfdhEarVw5V/o1zScERiqIX3oTdv5MmPZJehU8mDsD7r5Pofr9h7U43dTX++MoKWtFi/C+tSh8+w9gJX2nt88HKFK0Xirq1XpNKZKaazgk8nMJXM5nt+lqlLYuNq0twORJ/nzRgFVa5IYw5VpEv3sNle/9GIkEyTUlGDm2QyNIevoQfvmzKH73W75lsrTorSkiWHMD4udeRbz/MMyZAhBp+VsqwCyc5dPiXOy/HxaA/gEk2/bAjcZ+t+WT3jiAiK5qDECI6OLRuQYJ0NrgF3nZAk92OoIAyaEjcMe7/BXec10TSxvTmVP0inOwcqlPxXIOplxGtPFdVH/0C7i3tmrB+8e+g5AcE3mecrX9Ny8iuGEpwvUz4Aa6YVqaENx5C6JfPgO3fQ/sHTcBXT2It+9BeN/dKP75N2CamuD6+n3qlRxLua9KBfHufUj27YfkDLm9+xC/+CZMcwPngQDjQ/6iWHfmTHOjT+GT76WtoZMDBxE9+zKSHftgZF6HnL5dvXru2eULEX5mje5SSfrcWJqhBIZyLp6pxkaHTsYa0OguiuxOpbtbLqkiPnLM7wzqcEm+X0R0dWMAQkQXj/NDCLWL0OQ2H4BkuxKSOtXTByc1INaOp76crdDqjoDMy7B1DfpLrloBCkUEN69Csvk9VF95x1+xDq+AVCw5BpKW1iOzIPbB3d6vi2A3UoGZ1IbwoXtQ/enjCG67AU7mhLQ0InxwPUxTo9Z46BBDuUJf34Dk6BFUH30aybOv+knbhaIeK0R1t9+7AAAgAElEQVTVdPDjx+D1Xm4uTQ8shjAtDTCFkp+74dLdjyBAtG0n3PsHYGVnrVyjO1TC3HUjin/6dQQ3LPc1NRI4SOAn53RHpw9IzibojRPd3bBTJ8MVSz5Yl0BEdgYPHvafiU96swAi+kRgAEJEF49Mlq5UYadL+9LZcLJQSxJd3MniShfSkn9/Pous2rK2MI2eeQXhbTfCTJrkF5BDQ4hfeQvJa+/AlEpnN0Pj4yC78l6uQbzhHYQ3L4e9cbnf2SgWEKy7A0lfP+zsGVrrEaxerq2NfdpVAshrjRNEL76K6PHnkLz1LtyxbphSwS+GJU1I/pvBhyfBR+Jgpk6CmdTiv5cdmywY3HcQ7nAHnAwaPHAEaGlC+NA6hA99FnbxAp8GKOecwVh742TX+7pLcjZ1G5oeVwhh5830aVbVKkxDA+KX30D88NMwkpqoASPfNCK6ujEAIaKLyGkqkG1u1DQTXSjLDojUO0jhrsxA6Oj2C61zrdVId03ce3tQ/fXzKHxhPWxTM6pvb0Llnx4Bduzzk7/Pp7D9ctAAJIApFHX2Q7z5Pdgbl/onIgvT+noU7l2rx04naMtr012N2C9S6+sQ/fuLqP7Tz+He3QVTW4aZ2nbuO0ufBMYXgEtALPNW5Nx0UcUfJ5d2pbIWyb5DSKIKgttWwk6bAnvtAoS33wgzdSrc4ADc8IgvIq+t1R2Q+M2NWv+hBe0yQPKjyPkudUrSoGHBHJ3/obuBcIh37EW8eQfstMk+eGTjACK6yjEAIaKLRxZ0MgW9oR6mVPYdsMZqQELfcraz2xcAn2sNiNR7NNdr6lH0q2f9sMNFCxA//wpw6LgvTDdX4OJbWucOjyI5eFRbv+qUeEmvGh3VAYM6vE5mRWQzQQLf5UrrQp55CW7bHq0b0SBPXzqDjw9LzwvZjZs5FZD0QNnlyLpfadtjh3DpItgprbC3rEKweD5MYzPc6Ahcd7e/RzmP62t9LfvvXkH1saeR7Nrv0/7OdO5lAWdTA0xTs9bp6K7L0WNAV7cGmVdM8ExEdIEYgBDRxaGrMgfIVOlpbR++khtXYQZl8Fus3ZrOiyzQ5Gr1sS5U//FRv3CU7k/Sgvd0Q/vORH8t/d3LsfiTh66v01SeeMtOBH
euAZL+NAjxNQhjr02L7mv0OFQefwbJhi0wxRoffFy053OOx+J07+XHaSGdFYxLq+PpUxBMbtNaGx+AGD3WDlWED93rj6X8PfbzVPT3kL5OaccraYBPP4/qz58Euge0JkfbP3/kroXx9SRNdbDXzvdjQOTcLRaQ7OuCO3rcBzHnew4TEV1hWO1GRBdHdoV5UovvMJTEY1d9HYwOfkv6+y9sxkG22JXH0Xs1Ppg51bpNZ2gk4y2AT3d/1UR3U3Q2ROLyb+MrKT2SjrN1F+Lfve7bC2fD8TRFaMJC3vgOScnRDsRPvQB3qANorDvzY8jiWI5HNqzwVK8x6xIlx6F/UHdePnKXKhveJ4GS/I4UYstxHBhMA6fTvC+Xgxag+wF/0r4YxfJ4YJG9FjmjqlW44VEd9KivS6ekp8dfUqcKIeKtOxA98xIwNOrbSVdG02P1ES9WflStaGctu3CeT0GUZg1hCHe8E8mRo/59OdMOSHbMJ36dC3Oa+zjdZ4iI6BLhDggRXThZwCROrxqHUjQtOfbV6nj6laylpbh6uHL+LXK1janzE9BlUVhKA5m0tapehc5SbUZH/QLTTEhbKoQfXrDJwlFSnNJvG7kqbU5xu0strW+J396C+LmXEa67U8Mraf86cUdC6jyk8D56/hVduJr6M9QdyH1E6VR1DQiq/rHSmpwTFruyHpcr/+mugHHZ2O5TLYiz+636p5Y9x3Q3yUiwI495qmOeN1nYS7pVnGhNB6a0nfZQje04ZcHTxOceJ9oBK1y9EsHCeYjf2ITo188h2X0ARoIv6ZylqVinvns3WoEtl2HnzYKTuR/SEasUwr1/yN+HdtU6zTXB7HnIMdfGDm5896ZwFl3fsrexmg62zG6b3W9ahD8WjFjWERHRpcUAhIgujrTlrpkzA2bqZB8ouPEdBSm4dZVRmOA8N17lvnp6tTOR/fQd2kpVAhJ3+JjuHrju/nQhluhkacyepledkz0HNPVLcve1M1S2oJY/hoZhly7U9rZJdy/in//aTxavOcNAuUuhoR7ug0M6mDBcd5dv8yoByMTXL8+rbwDJC2/A9Q/6IYUflfoT+6v+mDMdpr6si12pJzGuBNRMeB/kPiSYaGlE4cF1Om8levL3cG9t8QvcwsQFbro47RvUwM1+ejXC6xbBtDT75xzHSDZtRfzbF/2OiBRnB+bylabEaYDa3gJ76wqYtma4keFT3/ZMwZKc3w11MO1tMDOmwa65EfEbG/15s/cgUFfna6BORWaOlGtg5s7SltJZi1/9XHT3w0xrP3UdSbqz6ANrvyvj9O+Jnxni0s5vH1WDkvgBjP73Y//7EuKa9ByQHRkJSHV3zPrnZz5i55CI6AIxACGii8elOxPyJTMOZD2UBRzHu7Rlri52zjbfIysSlv851qmL9PCr9yH87FrfYlYWVaUSqn//U0SSky8L4mntKPzZH8IumucXdz19iF/dgPilt/wCTBZrsiAblOBjEcLvfBnhmpv83Iz+QcRPPAfXO6Btf3MVWJiwgGTTe6g++iSCz9zma1tkNydLI3OJFqq7o53+qn5g0knzp6BD7oDgjpsQfvleP/CuswfJ1h3aucntP+KDMrmRHJdSiMIfPoDwwXt0Iji6+lB9aYM//hNrTGT2igyFvH4BCp+/G3bltbAtTTAN9emxTZAsWwTT1oLo188Dh4/7IKRwGWazSJF334CmToXrb0Nw+2ofnKY7N2eU7YI4jAXTToJZDazKCObNgZ3chmD+XK0LkaGS6BsGGhpO3EWQ96K+DDtrGkxdLYxuYPifSUAjP9Mg4+QdELmPSgx3vNuf27euQLDkGj9jR2pPevs0YJUJ6mO7ISen7EnwJd22SiHsjdfDSnG9BItyHAYH9b1MJPDdd9Cfa8e6gGM9QGsTWwIT0SXDAISILg5ZqMgQvXJZF7BawCsrNwk4Eofkvb3aglcXNWeKP2ya0iW1CLLgczHMvNkIv/o5FL6wDqaxUWcqmGxHoxCkQ+
YKWlycHD6KcN0dsDVlzSyyyxZp8BI/8pRPzSqEfjfmmrkIblzmOxLVlWEWzQWeqwXkeeYdgMjxk2L041Jg/4imsIWf+4yfbSKzJ4JQF4jSNUkK+WGDM+8qSGBhDey82TB19b4b7c3LYVoaEf30Sb+wlToEONj5sxDeebMPPmTt2tIC5/yVd5Ol8MgieWgE5tr5KHz3j1G4aeXYU3AjQ8DICFBbCzt7FgrfeBD2mrmo/tMjcHsPpGk+OV9Rl5QlCexkl+v+dbAzpvhBglIPMlHWPUxenwbQJR99yO2kaYL83Tg/d0YCvzQ9TmfalMsIbloBM3+mP65P/R4YkB2WCa2mK1VfG7Vg1vhx1J8lmpr1oaYMxteDyPsjaXbB+ts1cAhuuE53GG1rCxAU/E3ryxh9bZM+Z1PyO1BKAn/57AwO6/sVrr8TweplMNMmw0jHuKAAVxnWz2xy5DiCw0d119INjiDZsgPxW5sB6b6W1SMREV1EDECI6MJl4xQaa2HbWtLFW7p4lYVzAL3C6o73wEyZdOarqrLICwMEaeAgi6DgU7cg/OydvlhYUrEKfuBg9eEntR2tXkmWAKR/ENFPn0B4643Asmvhenpgp0xFcNuNiB952l/9liFysiCWBaXsdrSXtBWu7AroIvNidpU6F1LoHBY0TSp5/R3g9pthZPig7DjIlPTRKhLpmJSkgd1HkeL/0UG9Op5s3AK7aqmv1SgVYW9eCfP6O0gOHIGpAUyxoAtkhwTGxT4gk+LqsTQcvxiX3QS7bCGKf/FHCG5aiUSCOOnktGMPohffgOvo0UVycMsq3REp3H2n7tpU/tcP4fYf9QMA87yiLoGazFGZ3u7Pu1KNX5SfHAHL65fuYtIqemjIt8bNblOtIpF6lpoa2MmTtMGCTj6vpoXnIyNwo6OwrW0o/vm3NM2v+n9+7NOtGut9ypMEMVKALmlWEu6lO4PJsQ59D3TAYd2JC33X2Q1TLiH8wmcQfu3zsNOn+4BwJA18ogENEqUVtQw2TGSnKYrHdm2kmF6CEQk6wm8+iPDOW/1jDwz4c7yYwEgxfrGMQOaaLJjtg1ypRbp5BZLeXiSvbIJxga+v4kYIEV1EDECI6MIYv9DTdY/sILQ1T4hIzHhnJVkQnc3iU66SD4wCDbUIPnsHgrtu1eJrfaBKWtys3aACJB3HEb/wpubfm1nT0jz7Iuzq5UBzAxBVfcmCpA1JdyZkef7OLyAlCNGJ1oEu1tz23X7Y34V06rog6XR0KYSXgvhkQiF9qQjX1Y34rS3pArJwmgLxCeTHkvIjgyHlrxK4SADW26/Bl0mL3yVI8AXO/vFkR0RTjYzxy3B536p+ER0+sB72plVwIz4oirftRvXvvof431/WSezJPbfDzJwMO38enASOa25AuGsfoh8/7rtrSdpXXvNa5PjJjpzsGMjifKzIfAI5BkkEd+iILtrj7TuR7HzfP8cwhJGUss5OoFyLYPECBEsWwMgcGzn+LhkLVGRWiAQE4V1rED3+PJykDKZpcPKnpl61t6WfjXSH4/AxnediChOK1+W4SH1PqYjwK59F4U++qo/t+vv9a9DjlqaDyXnd2gR79xq4nz/tA+raGv/cBwZhF81F4S//CFYC8aFBPee1cYEMZezuRdI/pEGNpND5tMkYprkZRhsMRGOPQ0R0sTEAIaILlHZEkrWctBiVIt9KutCTq/ByBVjSO2RBE57lPzlpVy350qJfXYAN+MdxLl18A8nWnUB3jy7uskBDbh8sXah/6vOQ4tqhISTHJwyTS9KuUpPbYJpb/PeHRhC/u9PXDMgC83JNo3bpc5SgS4IRl9bSyM5IZw/i51/3Hapk0Xim5yiNABpqgelT0lDFt0WWq+66QE53kSQA02Oo74+k/hxHIotjHSAZ6MJVrvKH93zaT2t3vkVvsmcfKhJ8bNwGO2MazDXzdM4GjnUDs2cBVac7AoX/8CVdVFd/8oTfaTjT7s3FIOeD1PnMn4XCuts0RU
mCphPmnNTU6DkibXVloKXbvAtuYBhG6osmdgBLA6bo+w8juHUliv/lP8Jet8gHw9V4rIWxq44C7c0IPr8W8U+e8EF3uaSBjyz8jRyb7H5tgORYp9/NCyd0CxutIIljFL/1IAp/9EVNadPATT4LNUX/nkmwLEXsQ8O602SvX4T4sX/X1ystnaUlspzb4f2fRnD9Et050+CrpuR31zZvR7xpK6J33tPbl777bV8z1dWjwWey/zCwYy+MBKQ1Z5HqR0R0jhiAENEFc2kBraS5aCpUGpCkl8/TvHR3+jajE7m0ba50UzpwWAMDlOtOvGqeXrl3hzs0PQmyMHPpsDmZNi1XmrUjU6LF6u5QF9z7B9LnZPwV6dlTYaZOSh84RtLR5XP+bQ6L49Pysygkrz9YOFeDAlc5wzyOjxLFsLVl2JZWvQKu+xk2gJOZLMeOaw2N1iCUQ5gZU3xwIHp64Dq6YGxax5D4HRKzYA7slDaYaoRECqA3bdX3u/CthxCsuA521nSfeqQdm9LWv9JutqkJwb13Id6zD+7dnZLzdObWsRftkBqfWnSq0Se1NXBHOxD94BG4Dw7C1NXBnNABakJKlOwcya6ZdLmSoCKb0WImPI5L626itFV0dhdy7CRQaJ+cvg9pjYZ0cEvrcPT35ZhFEYK1qxHct9ZPYpfp+PKzmmKagjU6tgOo91tbAzttCmDDtLbFwfX2Ibh1lTYUkFopnS0j59KRY6h+/6eInnnFp/SlzQR8bY7V3SJ35Kh29tJGDEGQ324VEX2iMAAhogtj0jafgYFtatBFrKuODyGURZIMIcxSdk67mJmYglIuwQ0MIn72NSR33Yqwrc3nzA9WtEZi7MZjV7Mn/K7WOZT8oipyOl8hke5PO3br1V1NO5JF/rxFMNMn+0LgkYrWikinqI98jrlIX0M2F2Lic5GFtHy/chbpbBJkNdf7uovs8MgxGRmGk7oDaTc8dYofICgD+ubM8rsgcpR6+nTOyNjQQp3AXvI1EMUyXF+vFk4Hq5YiuHG5L2puqPXF2rITICl31Wh8ES7L2xlTYGdPQ7zxPR8YXvL/90kDJ2krq0XkOLFjmJ6bVbiOTn8s5HbSWSotvPcHLC1MH/GBQfj5tQi//RXd7dFUtonDDOUYSXMDqRk51AEXJf5ckwCyvVVbGyu5PznHwgCJzAA5fBxW6lO0JmlE63SKX7gHwby5cL29flywpN9J8PDU83Dv7oBdMAvhF9bDSktfBOl0/HTXUD57LY2wC+fC1jcgkU5X8pmJqqj+62Oo/ttvfPMGG8DU18Bet9CnZUUVvZ9k63bEG7ekKWiFS/0mEdEnFAMQIrpI3Icnd2f1BdJ+N5sWfborqif8mvEdfbp7Ufnf/4Ko7XGYZYsRrrtdU5OM9VfW3fBwukOQTUj3uyymvs5PmZar+/DdhLRVqfWzDuTKbzhvFuzsGf53pKORpBzp3IzLuQMyHkSZmtJ4AJLVZcgVa5+jdeb7kABAOi/J4L04bTsr99Pb57uL6drZt2mV1B47fzZQrNHb6jRwCSLkKn26yDYLZsNMavX3I/ddKMDMmu4X3fIejFa0g5K+35Iilz5/UyzB9fag+shTSF7a6GsN8mrJmw3s0x0Ge+JJJjtj/QOaRqa7NpryF48HsWkthhsY0t2ywtfvR/i5T/taIwk+Tm7lK/dv0rSqt7f5813aKA+PIFg0WwcQyiJf6y+kxkeOoaRMVX2nMt39KJUQrL5eBx3qc5UdjrZWTTWs/O3fI96xV2uU4jc3az1P4av3I5g9O01p88GSfCbCW1Zo8bk0CdD3AA7RS28ifmWj7mrJzBGdB1NXC7vkGk3zkvfQhEUNiNx7+3yQctmDcSK6WjEAIaILlF5pHnWAXAmXRZcUOSNNM4mrSDq7/ewKSTeZOFxPpVOnpeg68sW6ej+yUJXdi03bEe87gMJffgvmaw/4uobAatqX271f6yI07ctXWfu2q01N/uptdoVa/hz0aSi6yCyEsP
Pn+FoPKeyVlJmBAb9gvdxFt7qbFAJSOK35/ukVc6lf6O5Oa0TO8jnK/RSyq9jpolqK8dP5KVorEAawq66DaW0eqxGRgM0dPu6v4Jt0SnohXYy6ZDwwGhz2aUFZMb8Eh1KrMjwEd+CQfj/p8bMqokd+ozsvZmp7OsXeXfraAuMDDQ2GTtXqNk5Tmlx6nkhQIcGoHCM5LtcuQLBoPuxNyxDedqPONnFybmtx+Yc7abloVFP9nMy80cF+6X1OatMUN308l77H0rYY4/VMulMis0KuvcYfa5nJUVNC0t+H6DWZY7PBNxSYNEnrd9zOfVrA7iQAGa2OBwoS2MyYCjtnutaoyK6UpI9Jswb09fsdscR300JUBzNzutaBaNBZrWjqndauFIunH4RPRHSBGIAQ0QXyXXlMY50uLnXHIZtFkPgFl5nSjuD2G/0At9KpOkw5X2wrOfm7P4DbtgsolvTvMlsBfU2wksZSqvEdoLLp6lLsLIvF5sb0btI6E6kJQdppSFKsunr8FWZZXEp3rXvuQCCLbglidFZE4Bfr2ilLnnDt5bvyK4theXnSzUgCOFlQyiK1vx/JwcP+Nmc7TV6Ov7wX2UBI57SNr+vqBbp6kdQUEd62EuEffM6n+QwO6rwQLXru6gYaGv2xlPkjew5ogT7MxMdOj7cEnZLOdOQYksMdiN/eArdrry644z0fIH53B0KZHP7Qej8ccvf+NEXvPKfin43RKkxrE4LF82Dr63zh9sRhfxJoSbqUXPE/1OFT1mZNhVk0B7axAXbaZB38Z29YCtvSptPT9TyaOCE8G1AoQU5dHZKDhxC/u113fZDt0kljBA0IZvpzLetsJi2BZZdOJprLzlQS626SbZ+kQaPTzln1iF94FdFTv4OZ3OqDRnm8ONJdDDNX7rOCeO8+/z5nOxZhej5n/y27U9v3auAi83S0eF52V6a0IZCAUNvvVuCOdWhNjJ/Vw+5XRHTpMAAhovNnzNh0cS1mlQDC99/1i+hR3zI3vOMWrRfQ3YuTC9HTPHsjdR4NDYg3bEL0/Z8heXsLMJj4GQqSzz53xvjtM7JDcKqUqeyKtgQgg0NwkR8EJ8MG7U1LUfjGF4Cp7b6zlvx+sQi7YC7MvFlwUviug/4u0wIslkWj9d2/dIHq02jc0KivVdBF/xmem9RhyAwIadHa3uYDwiw9To6NFFJPb0d4w3UIH1iHYOF8H9jpjJQK7PIlsHfdArdp+9hVewlazPBw9gA+TpPdFWMQb96O6LWNGnQk7+5AsuN9oL3Ft54t16Cw7g4Ea1YCLQ2I33gH7r29/hyRgu9LFefJ1PiwCKfTvJMPdwxzafvoUgizfDFscyPsdQt0Or5dvABm2hTtAiVdsqTmJWtRPCYLZmROjZz1e/YievI5JO9s94GV/EwW+TOmIJA6i7oGX9Mh3awmHENtyYt0Wrqcz1KXJM9ZhnrKj/cfhnvtHV+vJHNKevpgZ05BIIFRcyvijuNIfv+6v70EkTr/I61dCYLxQvnmBjg53hJgSDvehXMRPrjO735oI4ISku3HkOzd7+ugrOX2BxFdMgxAiOj8pUGG0dz1FbAtzb4r1cT1cXql27Q0n/5h0phFCqSDG1bANjeh8nf/gPilt7Ruo/DHX0Jw9+1w/X3jk55PV0eSdT3SWR9VmHI9gpuW69VuO3s6wjtuhq2r98W5sqiUdJhyWa90y9A3aRULKUyWq+Z507SgSJ+TbUmnXWev1+Ck3YcziBJAUtNam/z9pjULZu5s2HW3I7z3Lp1ZYadOQSKPKcGh7AgNDCJYuQyFrw6j8v5+oCdNpwtDxJ3dsNURn9Yl95fWcsRvvoP48Wf9BHCZ/bHiWpglc2HmTEe4ahnsqmWId+5G9f/9GO6dHX7H4FJ3wZK0q44uJK+9i+TzR2Ck3kcC4ix9SlrY1tYi/NJ9CB68RwM1qbvRKfPyc0lJkttmwRsmnHNSVyHpfLLglzkhGzaj+oOHNU1Kdl00fUnuRwLr6Z
NhWhvHdgqzXR/diZIdp2zoZZoSpvNX5Ly0vtpHJ6hfM1trUXSeTqmA4OsP6OdBn9K+g0heftvXnEjr5krkd6KOdsAuXKDBhTyP4MH1GgAmm3fA1Ncj+OI6hPKZ0qYMUn/SgER2HyU4LKSDOhl/ENElwgCEiM5f4jR1xE5rQ3jbau2+g+6+D6dvyGLs5CvIzk3oZpVeUc7avc6fi+JffQeVnj4kEjgsvx6mXAfX1TWh01VxLK1oTJbXL4tp+FQZnQUxZzqKc2b4wvZiEU7akmbdiHS4W+LvT9K8iqF2MLo80td2qjkZUkAfhmeXl69F7EVdUCbbdyFYKQFYpHn/tqkRxfvX+VkXsuCOfd2DBjc27eYkDycdzZoatFhZH1vqcV7fBCc7BMuWwPX0jwWb0pEp+MztwPAwTE2NzoLR4ETb2hYQv/oGqt/7Cdy7u/xxzqm4WTuyHTyE6hPPovjtrwIymVxqOOSxI99wQOqArOw0pSlT/hyMT9wxydKu5PjroMh0PsrxLkS/+R2qMmH/wFEftGa7DnI76XS1aTuSd97TSfR6HKPIBxaywyXHIm1RrYGIpErt2AOsul6DIxk+GKxejuJ/+y6iJ55DMjKKcO0tKNyzFkFdPaL9+1F59Em4vsGx6f7SljfZuBnR7+eguHihD8YrVYR3rka4ZpXv6CXHv5wOLNSGAiFcV6cOYNTidGk2wAwsIrqEGIAQ0fmTq+AyaXrRXK3VMFldcWDHAgCVLeayNCBrxmeEZOR7MstAinGrFU2JCu5fi+Sff4Hk7XeRLJ7jr8xKcCGD86yfaTF+dRq6oDNDw4hffQtW6lEaG3xBsdxOJ0SHSA4fQvVnjyNYfj1CSTNKJ37L4s021fv8e7lyfTkKcCWdrbFBO37prIl4Qqelgk0X7jjz4jBt4+sOHkX89jaEDwz4HYEkrS+RxafsrgwNovrMixqoFL/5RV14ao2GvPSaIkx7q06Zz7qSSRelaM5MFKSjk9yHXF13DnZyGzB92oQDNv4Eqw//CtGPf4lk/zHfLrammM+BlfNC2ssODiF+4nlE0yejcN/dQHOTnyqenYsyZLHqB1yOPSuD8R0aPTdM2tGroM0NZIZK5TfPwz3zqg511FbGUoAv51icjKc9aWvdCpI9H+gwTiP1HWmbX9mZGNttcf49kXTB+PdvIFx/F0xjE+CG9X7DW1dp0wT9SE1uhy2WEB85gsq/Pobk+Td9MCmfjWxIZ08/kt+/gXj5EhRuXe0bnrlYZ5GMdVCrjPrPRhBqql+8dQeSQ0fT4nNGH0R0aQV/vWD538g/fTzORHTOJFgolWCXLUK4aqmf0eEMjCzGpLVpqeS/Cn6Rows5uUJcKPqFklz1zb6kjkRuW/QLKStXbpvqET39IuI3NiG47hrYeXP8FfTePlS/9yPEr23095vtGMj9DI8ikXal09oRXDPfX5FPF1XJzt2I/uERbQkrRdOS7x9kRb+79upAumTXB/45nM3QxIvJpB2MwgDB8kUIb7vJHyvZtWhohDt4GNFjv/FdijRF5gyLRGt08KA70uGH2y2/1g/aK/qZGO7wEb2qXvm/P9LjJUX5wayZOh1cit2jf3sSycbt6XMKdc6LDIWUImU7rR3hkkX63stOiyzgTZS2kzX+vYh37ET0g58h+vGvgI4ev/jVeowco7ps56JnwKcXVUY0qLKTWmFqymlg4Xc2JP1K55jIuSSBrHxJq+FyrZ4PrrML0WtvI370aUS/fAbxC29oupKR3TLZWZFjdHKdiQRu8rsD/brTEiy9dqyDmLyfycsbtJuUwukAAA57SURBVABe6i80TVF+X2bWdHbDzGiHnTHDf6bCgu5c2cZGTUOU51H90S+QPPeaf/xsNyMjvyMTz7fv1s5cVorrG5rGAkNtX93d7T87pYK+zvil1xG/tgmmmhazMwghoktnhDsgRHT+5Mpr/wDctt3aqlUWS8lAF+LNW4GjnUgOHtOFjF22UNN55OqzpjhpK9
OTUrJkYSZTn2VR1NyI+INDSOR+5YrxBwcR/eoZ2EXzgMYmRK9vQPVffwEjV5Dl6nsUj9+HLJ6OdCL6l19o+oudOdWnih3r1CvRyZtbYJxB8s4OVH/wMyS33qCtfONXNiB+4S2/GJTFclZ7kRfnW7lKwJG8tQXR628jvOs22LZJSEaHET3zkhZ4jwVsZ1rIZzsA0gZXApeBIdgZU33BdFefTplPtuwEjnZp6lz1X34Bd+S4BjfRS2/p8TCyI1OXLtTlon5Lky6Qq4/9Vq/s64yMSc2+vkLSmCQ4kvOhs1uDw1iKp+U9li5lBh9eoF9q2Y6APIdDHYh+9mufDnX9YthFc/zEfKmbkICgs1uPvb7GIb87IOdssu8g3PsHtbVusvcDDVAlENPhi9JBKpsUf6r3Q3eiCsDBY4h/9yrsiiV+4KME4Daty8kmrkstUtF3b4ufe0WfT3DzCh3gqClSUs+y76CmkMlnQ56TTpxvqhvvxpUpBvqcpBlA9YePIdm6B2b6FP8YmnIojR3maHMIDbqCEIl0OTtw1L8uBh9EdImZjvXf7ALQwgNNROdMujN192o+e+Gv/lgHAMoU5WTXPjiZfv7+QU1/squX6pwJGaqmHZikDawWA6cPaHz6les4rostuW0srXglrz5MZ2E01SO44yaYaZN1R8S9u9P/brpjMiZbPPUPaaqNaW7wC7jjvZr+owssCVKkJa/sEtSW/IJThsJJ4CEzGJLLVH2rxeIVv+uwZL4fNLdwLqIX30AkuzadvX4X4Vx2Z2xaUC0Laq2bSV+7PJwUUmvhctWnEbU3+yLm3kHd2TjlvI60QFqL5aVz1uzpCK6Zo+9TcuS4nywuczCGK37uysnvz+WQnRMjo3C9A37exoJZsLNm6AR3ORdlgKAcF9veAjcwjKRvQDtjJTv2wH1wxO+S1NX4VrjBOdawSIeq+hqYFYthl14LO6Vda3Oi51/VQF13MCaeA1KMLsX/ofET1CV97mgn3PF0Wr+cs/I9pN2zdJbISQM+sxRCCRR1Mn06K0TeZ6mJ+pOvIPyDB3wAMjqC0f/6PxD99iU/r4TDB4no0upmAEJEFyZJC25rCv4K+Eg0llLir9RjrFZgrB2s1GScfJFVrrAHvvWnixNflJumZumCWydtD/t5DXL/snA+3VR1ZO19s+nWaV1KWusx9oBRMt5yV2eB5Jx2dSq6AI3hBkdhmmTwXBNcVz8gC+e6M7zm00lbHY/tQGTHYWIgI7fRTlluvAvSKZ9f2mEr6yYVSw2F3y3SwYWBHT+WeaexnQ2djZL42o84HmvZbKxvPevS16WvRW5byFL80nSt89kdMH7ivHSxkm5s+vjFgg8AJXg41c6QS59nlH5msmOapZXJKVCTztSRDllZmtypTo0k7fYmn62ePgQ3XI/Cn34N9oblWrcV79mLyt9+H8mGbb6LFxHRpdXNFCwiujBZJ6reAZ+v3pheIbYTFvvpollTRsaqqE8RgWSpPlkXoez3045VegU6mxSO07TizZhszoQdnx5+wu3TblNji7qPydiDdJiiqUt3ZvYd8ceztmb85+cq66zlJnTXOvn1ZulCwEcvsrNgLgvWZGZJkLWnNePF1x9XQTrjIqu5yLqxpTU1xhXG59icfB6e785Adj/ltMYpC9oDe/q0NA3SgwmfITte95OkLaant2v6nNu6K72f8NQnsT5Wev5LcN/aAjt7hh+AKAMkO3v89PMClwRElA/+a0NEFyZblKVD6U64Qp/9GUwIRMacZo4HcOqr/GlnpxPu90yy25mP+J2JU60/TmThWUoXq+d75f1Ux/Z0f8cZAo/TkYVxNtz+SsjcmTDLY6xbG046V3DSa7lYKUm6qzfhAT7qbic+z5O/l93F8AjMvBkwy5f4hgHSYrdc/HAKocsez3f90t1D6XimdTmxnwOSd80TEX2ifYwvUxHRFcWctJA7lbFp3NkU6JO+Mh+VVvVJyk/XoYr2418U7K6Q4ONk7hTn1K
V8LRMf70IfQ2puDndoO+pg3W26wyVtfE/bo9ml9SJN9TBTWrUDlsleqzSFGK36tDMiohzwXxsiIqIrjaRcSRrZlMkIbloBe9dq3zQgm2vzIb4DlhS1mznT0toSHxBJVzonzQ8YgBBRTvivDRER0ZXGph3Tunph29tR+Pr9sPNn6oDDD6VgmbTNb6UKO3Oadv9yUZTWoDjEW3bDdXSPF7UTEV1iDECIiIiuNNJUoJIgfn0T4mdfQDBnFsKvfx5GWil39pxYtA6trtdWzLatFXbuTB1O6YvjQz+ssm9gfKAnEdElxgCEiIjoSiNd5dqadKBk5Z8f8d2T71+P8AvrgLoauGMdwMCIDgj17ZKdnx0yexogU9Fd2qK6p298CCbnfxBRTtgFi4iI6EokKVRRAtfRg6SzC8HUqSh85ytwYYj44SeBoQrcyIh2y5LuV8H6NQhuWQHEFe1a50ZGEW96F65/AKipuTIbCRDRFYk7IERERFeixME01AFDI6g+8gTi/j7YYhnFrz2A4n//zwjuvR12ZrtO9zfNDQjXrkGwcJ4fXKicHxIqBe0f80ZrRHR14Q4IERHRlUhSpmpr4Xr7ED/yW4Qrl8LdfgtMfS2CVUsRzJsJ19UL19GlHbLMvNlwI1XfDats4ST2ON6jxekmYP0HEeWHAQgREdGVyjiY2jJc/yBG/+cPUZK68jtWa8qVaWmGaW8HFiValO6GhoDhYf9Ci0W4Q8cQv/AmXHef30lhDQgR5YQBCBER0ZVK2uuWCjCmHsmWnah8/2colUsw1y+Gk5kgUf/4fA8JMOS/rYUbHUWydx/cgSNAnPgOWPInEVEOWANCRER0JUsDCDtlEpINm1H5yS/hjnbANjXBNDUCNvB1HhKARJH/Xu8Aoqd+D5ckMHW1/udERDnhDggREdGVTnc3DExDA5KdHyB69Glg7Rpg1jSY+jqYYv3Y9HSpOI+37kTy8tswErww/YqIcsYAhIiI6Gog8z7qanR3o/rTJxG/vAHBZ25BcP0S2AWzgXItTE0N4i3bED32JBDHQKnEt56Icmc61n+zC0ALDz0REdEVzqR1Idpe1/najmIBZuYUmNnTdRck2bYbOHJcO2NBul9x94OI8tXNHRAiIqKrhUt3QmprfG3I8KgOIkwGh4D3D+n0cxPFQFjwBekMPojoMmAAQkREdLWR3Q9NySoDpuyDDvmS75XLWi/C4IOILhcGIERERFcrCTIkzpDdjuKExpcMPojoMmIbXiIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIi
Iiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyg0DECIiIiIiyk0IoAdAPYABHnYiIiIiIrpE6gH0/H/aWz1fsxsaNQAAAABJRU5ErkJggg==随后将图片编码设置在image属性中:<div id="sakana-widget" style="position:fixed;right:0px;bottom:0px;"></div> <script> function initSakanaWidget() { const demo = SakanaWidget.getCharacter('chisato'); demo.initialState = { ...demo.initialState, i: 0.001, d: 1, autoFit:true, demo.image = `data:image/png;base64,图片编码`; SakanaWidget.registerCharacter('demo', demo); new SakanaWidget({ character: 'demo' }).mount('#sakana-widget'); </script> <script async onload="initSakanaWidget()" src="https://cdn.jsdelivr.net/npm/sakana-widget@2.3.1/lib/sakana.min.js" ></script>这里注意图片头部需要声明图片后缀以及编码:data:image/png;base64,功能优化官方版本在默认情况下会触发一个异常:Links do not have a discernible name简单理解就是挂件中的Github链接没有一个浏览器可以辨认的描述符,从而让这个超链接变得可疑。我们可以针对这个节点元素人为地为其加上标签描述符:$(".sakana-widget-ctrl a").attr("aria-label","Github");与此同时,如果不希望链接直接跳转官方的Github首页,也可以人为地更改其地址:$(".sakana-widget-ctrl a").attr("href","//github.com/zcxey2911");性能优化通常情况下,这一类特效挂件都会增加一些带宽成本,为了不影响首屏响应速度,我们可以采取异步的加载方式:<script async onload="initSakanaWidget()" src="https://cdn.jsdelivr.net/npm/sakana-widget@2.3.1/lib/sakana.min.js" 
></script>但事实上,script标签存在两个属性,defer和async,加上两个属性之后,在js真正执行之前都不会阻止html的加载。因此script标签的使用分为几种不同的情况:没有defer或async属性,浏览器会立即加载并执行相应的脚本。也就是说在渲染script标签之后的文档之前,不等待后续加载的文档元素,读到就开始加载和执行,此举会阻塞后续文档的加载;有了async属性,表示后续文档的加载和渲染与js脚本的加载和执行是并行进行的,即异步执行;有了defer属性,加载后续文档的过程和js脚本的加载(此时仅加载不执行)是并行进行的(异步),js脚本的执行需要等到文档所有元素解析完成之后,DOMContentLoaded事件触发执行之前。defer和async在网络加载过程是一致的,都是异步执行的,二者的区别在于脚本加载完成之后何时执行,可以看出defer更符合大多数场景对应用脚本加载和执行的要求:<script defer onload="initSakanaWidget()" src="https://cdn.jsdelivr.net/npm/sakana-widget@2.3.1/lib/sakana.min.js" ></script>结语香令人幽,酒令人远,石令人隽,琴令人寂,得一挂件,清赏把玩,足以借境调心。

巧如范金,精比琢玉,一分钟高效打造精美详实的Go语言技术简历(Golang1.18)

研发少闲月,二月人倍忙。又到了一年一度的"金三银四"春招季,又到了写简历的时节,如果你还在用传统的Word文档寻找模板,然后默默耕耘,显然就有些落后于时代了,本次我们尝试使用云平台flowcv高效打造一份巧如范金、精比琢玉的高品质Golang技术简历。

首先来到云平台:flowcv.com,点击try free,然后选择Resume,点击创建新简历。

一份合格的技术简历大抵包含六大部分:个人信息(Information)、个人简介(Profile)、工作经历(Professional Experience)、学历信息(Education)、项目经验(Projects)以及技能列表(Skills)。

个人信息(Information)

个人信息指的是求职者的基本信息,如名字、年龄以及联系方式。但事实上,真正必要的就是名字、求职岗位Title、邮箱、手机号以及所在地:

名字:某某
邮箱:123@gmail.com
岗位:Golang Developer
手机:133-3212-3212
Base:北京

简单扼要,直击要害。

个人简介(Profile)

个人简介是对求职者经历的一个简单描述,内容不必过多,但通过简单的描摹,可以让简历筛查人员在短时间内判断求职者与对应岗位是否匹配:

Web开发领域深耕三年,热爱编程,熟练掌握Golang开发语言,掌握关系型数据库和非关系型数据库,掌握Golang高性能框架Iris,能够在很短时间内独立开发项目。非常注重自我学习和提升,能够胜任高强度高压力的繁杂工作。希望能和贵公司一起成长。

这里首先展示工作年限,然后表明擅长语言与数据库,随后突出使用的框架,最后强调独立开发能力与抗压能力,这些都是研发人员所需要具备的基本素质。

工作经历(Professional Experience)

工作经历就是求职者过往的研发经历,一般情况下需要列出公司名称、任职时间、岗位名称和实际工作内容:

公司二 2019-2020 Golang开发
任职于海外电商核心交易订单组,主要是做印度、港台、西欧、俄罗斯这几个市场的项目开发。期间主要做订金预售、企业购等大型项目的开发,以及负责购物车整体的架构重构。

公司一 2017-2019 Golang开发
参与公司里多个项目的后端开发,负责后端服务的架构设计、开发以及维护,构建高并发高性能的后端服务,并进行优化和技术调研。在公司期间参与的项目:某某小程序。能够敏捷开发,配合产品以及组内成员完成接口的调试。

这里需要注意的是,最近的工作经历要在上面展示,而比较久远的经历在下面展示,因为招聘者关心的其实是求职者最近的工作经历。

学历信息(Education)

学历信息除了毕业院校、毕业时间以及专业以外,还可以把主修和选修课写上:

某某大学 计算机科学与技术 2016-2019
计算机组成原理、计算机系统结构、操作系统、汇编语言程序设计、高级语言程序设计、计算机网络、数据库原理及应用、软件工程等

项目经验(Projects)

项目经验是一份技术简历的核心,面试过程中,招聘者和求职者所沟通的重点往往也在过往的项目经历中:

项目一 某平台项目 2020-2022
项目平台主要涉及印度、新加坡、西欧、俄罗斯四个机房,总共13个国家的小米网站点、10个国家的POCO站点。用户数达千万级,业务高峰时并发量60w,印度市场日订单量达十万。任职期间主要负责的模块有购物车模块、算价模块、下单模块、订单查询模块。完成原有订单系统功能的迭代,参与双十一活动、黑五订金预售等活动的开发;基于项目的高可用可拓展,在业务架构、系统架构、技术架构三个层面对订单购物车模块进行了重构;对业务中分布式事务的一致性做了进一步处理,用grpc调用替代原有大量缓存的混乱使用,对业务和业务之间的耦合进行了拆分,进一步实现高内聚低耦合。

项目二 某公司项目 2019-2020
基于高性能框架Iris实现Restful风格的在线聚合支付接口,聚合封装了支付宝、微信、京东等三方支付平台
Hash取模算法设计分表逻辑,负载均衡
独立设计基于redis异步任务队列的风控审核架构,同时配置自动化循环队列任务(有序集合)
利用Websocket实现后端消息主动推送,改造前端传统轮询技术框架,减少了30%的网络请求数,节约了大约一半的可用带宽
使用Redis集群作为缓存介质,缓解数据库压力
利用Docker进行服务封装和业务解耦,使用Docker-compose批量管理容器集群,用Dockerfile编写部署脚本
Nginx反向代理Tornado,采用加权策略的负载均衡技术,后台服务统一使用SuperVisor进行管理
利用百度AI对用户投诉及聊天记录信息进行模糊匹配与情感分析,预测用户导向
后期使用Thrift框架RPC协议架构对传统的http接口进行重构,提高了整体接口的性能和吞吐量
使用Redisearch打造全文检索引擎,百万级数据可以达到单次检索10毫秒以内的速度
开发、测试用户认证、订单、支付/退款等7个模块

大体上,遵循"做了什么和得到了什么"原则,强调项目结果,但也重视项目过程。

技能列表(Skills)

顾名思义,技能列表即求职者所掌握的技术栈,一些和岗位不相关和过时的技术栈可以略过不写:

后端框架:Iris/Grpc
前端框架:Vue3.0
数据库:MySQL,Redis
工具:Docker,Git,SuperVisor
其他:Websocket,百度BCC、自然语言分析
外语:CET6,能流畅阅读英文文档

模板选择

当我们填写好简历的六大核心部分,就可以选择一块称心如意的模板了。模板会根据简历内容自适应,同时也支持布局、字体、ICON等细节的调整。调整完毕之后,点击下载按钮,就可以得到一份pdf格式的简历。

PDF和Html的简历格式转换

美中不足的是,flowcv平台并不支持中文字体,但是没关系,我们可以通过技术手段"曲线救国"。这里使用Golang1.18针对PDF文件做转换操作,首先安装转换包sdk:

go get github.com/pdfcrowd/pdfcrowd-go

这里使用pdfcrowd-go包,随后编写转换脚本pdftohtml.go:

```go
package main

import (
	"fmt"
	"os"

	"github.com/pdfcrowd/pdfcrowd-go"
)

// handleError 区分处理Pdfcrowd平台错误与通用错误
func handleError(err error) {
	if err != nil {
		if why, ok := err.(pdfcrowd.Error); ok {
			os.Stderr.WriteString(fmt.Sprintf("Pdfcrowd Error: %s\n", why))
		} else {
			os.Stderr.WriteString(fmt.Sprintf("Generic Error: %s\n", err))
		}
		panic(err.Error())
	}
}

func main() {
	client := pdfcrowd.NewPdfToHtmlClient("demo", "ce544b6ea52a5621fb9d55f8b542d14d")
	err := client.ConvertFileToFile("test.pdf", "test.html")
	handleError(err)
	fmt.Println("转换完毕")
}
```

这里使用pdfcrowd平台的测试账号demo创建客户端结构体,然后使用client.ConvertFileToFile函数进行转换操作,将脚本所在目录的test.pdf文件转换为test.html文件。程序返回:

➜ mydemo git:(master) ✗ go run "/Users/liuyue/wodfan/work/mydemo/mypdf.go"
转换完毕

藉此,我们就得到了一份Html格式的简历:

```html
<!DOCTYPE html>
<!-- Created by Pdfcrowd (https://pdfcrowd.com/) -->
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta charset="utf-8">
<meta name="keywords" content="Free Online Resume Builder, FlowCV - https://flowcv.io">
<style type="text/css">
/* Pdfcrowd生成的页面样式,篇幅过长,从略 */
</style>
```

随后就可以通过Html标签以及Css样式来控制中文字体或者其他样式了,最终效果如下:

结语

毫无疑问,对于简历来说,内容大于形式,因为内容是事物存在的基础,但同一类内容不应该只能有一种形式。内容是简历内一切内在要素的总和,而形式是这些内在要素的结构和组织方式,简历中的内容和形式应该是辩证统一的关系:形式服从内容,并随内容的变化而变化;形式对简历内容又有反作用,形式适合内容,就促进内容的发展,形式不适合内容,则阻碍内容的发展。

彩虹女神跃长空,Go语言进阶之Go语言高性能Web框架Iris项目实战-JWT和中间件(Middleware)的使用EP07

前文再续,上一回我们完成了用户的登录逻辑,将之前用户管理模块中添加的用户账号进行账号和密码的校验,过程中使用图形验证码强制进行人机交互,防止账号的密码被暴力破解。本回我们需要为登录成功的用户生成Token,并且通过Iris的中间件(Middleware)进行鉴权操作。

Iris模板复用

在生成Token之前,首先我们需要对项目的模板进行优化改造。目前存在的页面模板有三块,分别是:首页模板(index.html)、登录页模板(signin.html)、后台用户管理页模板(/admin/user.html)。虽然页面并不多,但不难发现,有很多重复的代码:比方说,首页模板和登录页模板都有公共的头部导航菜单,没必要每个模板都写一遍相同的代码;再比如,三块模板都会有axios的封装逻辑,也没必要三块模板封装三次。除此之外,以后模板多了,不做复用,就会出现维护困难的问题。

首先提取页面模板的公共部分,比如头部导航,在views目录建立header.html:

```html
<nav class="navbar navbar-inverse navbar-fixed-top">
  <div class="container">
    <div class="navbar-header">
      <div class="switch_a nav_swich">
        <div class="react-toggle">
          <div class="react-toggle-track">
            <div class="react-toggle-track-check">
              <img src="data:image/png;base64,图片编码从略" width="16" height="16" role="presentation" style="pointer-events: none;">
            </div>
            <div class="react-toggle-track-x">
              <img src="data:image/png;base64,图片编码从略" width="16" height="16" role="presentation" style="pointer-events: none;">
            </div>
          </div>
          <div class="react-toggle-thumb"></div>
        </div>
      </div>
      <button type="button" class="navbar-toggle collapsed" data-toggle="collapse" data-target="#navbar" aria-expanded="false" aria-controls="navbar">
        <span class="sr-only">菜单</span>
        <span class="icon-bar"></span>
        <span class="icon-bar"></span>
        <span class="icon-bar"></span>
      </button>
    </div>
    <div id="navbar" class="collapse navbar-collapse">
      <ul class="nav navbar-nav">
        <li class="index_nav index_index"><a href="/" title='刘悦'>Home</a></li>
        <li class="index_nav index_1"><a href="/l_id_1" title='python编程'>Python</a></li>
        <li class="index_nav index_2"><a href="/l_id_2" title='前端技术'>Web Design</a></li>
        <li class="index_nav index_3"><a href="/l_id_3" title='数据库相关技术(mysql,redis)'>Db & SQL</a></li>
        <li class="index_nav index_4"><a href="/l_id_4" title='Mac & Linux(苹果系统和linux相关技术)'>Mac & Linux</a></li>
        <li class="index_nav index_5"><a href="/l_id_5" title='Go 和 Ruby 相关实践'>Go & Ruby</a></li>
        <li class="index_nav index_6"><a href="/l_id_6" title='生活和工作'>Life & Work</a></li>
        <li class="index_nav index_7"><a href="/resume" title='刘悦简历'>Resume</a></li>
      </ul>
      <div class="react-toggle bigtoggle">
        <div class="react-toggle-track">
          <div class="react-toggle-track-check">
            <img src="data:image/png;base64,图片编码从略" width="16" height="16" role="presentation" style="pointer-events: none;">
```
u4JX8ycI6wtw+i5ef3NZpsrKVSHYCP37jwGDgeE1SA0S/xtl5SU2fs1ApEp0qTLVRjgyycDSsLHMSwmFltZMStR3uLLg6BdLhDa5dC6ryU2pHBe1BVO9tUcwfitJt2CLJZUHoG6T7Op75u0IyK31TCPcwFqgPk/KCaD3dFOuZBCO7xvCT/j048b3I3c7F2+WuOW7qdgkucFYlcQ4qop3yzTX7WaKfOCccye3Ts1Etq0+a/BHCF1yPgF3tAUkR6OrtGmo6gl94qqcXKh3rDyrOkPa58URoWcov2Mo6M+0QjrqKB+b7++oMa9Sz+ZkM0mie6aAtnGUvhmxaI+TogPOSQedgWioGSHFLn3v4kLh4HRspNmOGv41k+55siLFp2z6xYeJjhljFcbmxJlr4ga06TbevSByz/glQq4BJx46/c+237PbBqEYKxX3HpmKZEnQnr65X20hqJYaNcLoFOLiJk2LuBbyg7Q0OEn+hm0P3honxFD6rdxYorKpeIoi4YSSvyQHQIbM5t4+YNxLj/OxhVOOE4585qGpjnq+wSx6Q9CtNxTjd5klB+g6Mv36r0+b9cZFi44WYkHdG2ZWb3TtOUOXyVAlKlpGvJIAJ3eBMyfYS5C0qRZGtC85j+4sOasDe9xznPYezhhO/2Q6eP2fSOvYHOjtuQ1a9Q1VKynVDaMc8E0tptdxUsTFpFIYjcZKcbnoaQTNdiqCwNlL4G7oziSqGnT1ALf34vhk4R5zU3qYV9ONp9K88RtouShE68JwaU8dFw5W617shWa9ykeaBIn2hcsvPgL00k45QdTCZuSVcTRNs+8fnyLvooQfR5iujAnR9bxfY2xOVOxFS8SK3Le0l48VyYu1M8HRe5JD8wKPTjYnifaK3Wfn/GChYQ8ZAi6WRzWgqLV5YrsVLnZaVSoXU1g9gOIDwFySiGi+Zdrnzr7J3r+SMuszlcQCRn8lNGcTuSy2jOI7o9mxjZo+vR3ej3tN+ifRSOyUTS0+VMOid93cCubeiy/6TImS0QxRSCq2vxKr45zV+FQnjWH6D2xg+E9EatLcLAdHTgtGGD80D6jM0+aOl4wJgO/f96R2aJKCQ3yvgftRhdFMOpd6oAAAAASUVORK5CYII=" width="16" height="16" role="presentation" style="pointer-events: none;"></div> <div class="react-toggle-track-x"><img 
src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAACAAAAAgCAYAAABzenr0AAAAAXNSR0IArs4c6QAAAAlwSFlzAAALEwAACxMBAJqcGAAAAVlpVFh0WE1MOmNvbS5hZG9iZS54bXAAAAAAADx4OnhtcG1ldGEgeG1sbnM6eD0iYWRvYmU6bnM6bWV0YS8iIHg6eG1wdGs9IlhNUCBDb3JlIDUuNC4wIj4KICAgPHJkZjpSREYgeG1sbnM6cmRmPSJodHRwOi8vd3d3LnczLm9yZy8xOTk5LzAyLzIyLXJkZi1zeW50YXgtbnMjIj4KICAgICAgPHJkZjpEZXNjcmlwdGlvbiByZGY6YWJvdXQ9IiIKICAgICAgICAgICAgeG1sbnM6dGlmZj0iaHR0cDovL25zLmFkb2JlLmNvbS90aWZmLzEuMC8iPgogICAgICAgICA8dGlmZjpPcmllbnRhdGlvbj4xPC90aWZmOk9yaWVudGF0aW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4KTMInWQAABlJJREFUWAm1V3tsFEUcntnXvXu0tBWo1ZZHihBjCEWqkHiNaMLDRKOtQSKaiCFKQtS/SbxiFCHGCIkmkBSMwZhQNTFoQZD0DFiwtCDFAkdDqBBBKFj63rvdnfH7zfVo5aFBj0l2Z/dm5vd98/0es8dYjlpr62azufnDQNZcU1PciMfjWvb9rvZSMk4Ayfb36pLH13189GC8LAtIRLLPt+pzwrCuLq4ISEv/gHmitrAwfPbEkXc/ad4dL6iujrvyX0jcitgd/yZlZqftP6995Mr5TVLa22Tn8XVX2g/XLSRjUu7Q79jonS7I7hS7/0oOb5VyqF52n98oj7esXX07EjlxwXWisRmSnm3b29TTM8iYrjmFBWExubxwY/uhNas4r/WySl1fc5cetDMd7ydl+lMJJRw5WC8ud62Xx5rfepzwxgZmbhUYNS5Stvsj4yo2GXJEFBVHWDBkfdbR9HpYBaaUajDnBLKKpl1xRKYcgGtMCqEzTaSnThk/SQT0uJqTqFNBmXMCsZE48DzRZRMBRjv1GHNdk3HBImF9ZUvTyxM40pMKVc4JZBXQOLOFoDeKSxdp6HIQcO4rjYT9fn0pjbz9GLt7BAAODmjSVReXUMFzNW5x5vfxp2mIxZjIuQKJxAmFa+is2DQJJQ0JyBVExNOYcJnPxx/6/utnijmP555ALEagKAGGnGn64QORBjARcIA/yJk7JMJBLRrNtybTvH88KGjCf2jK86bhzmMcwDKFZEQvbIhxFYhChoMWMzU2iWznlIBEVJOsP+1bdX/ALx9l7jApADeDAEcMkE90JnUmmGl4USKQ0xhoW3JB5XY0YrxYWhLwMZZypUyjDGH35AbNwgUGiFBPpuGbHCpAOV1ZGXf2f/taftAv31DyeymN2d1IhAFAwTOmnzF/kKcdh3me7CYCOVNgycju84u8DeVlwfFq9/ZlTfldYrMUjOlrkjkD+rU+WzCROkcEchIDHR011syZW9JHD7y07N6JvhWMpz3pugaTkB6lWFVCKkhck0zzeMp2utq+uHrmfxOgoCO/Z8CXPlEQ1bdH8wgvhSIkEG0ICcQeExIFGdimjvKka7btJFZuaXOammIGKUCFQ53j9EN1dYKWqHf0t2w407W2tgs6h89ZnImjB55flh81tt9XirjjDuSl+oIPRQ0iWPgNZ5GqTqbBe3vSzEl5n5PhWKwocyR2HlqYN61qV18WjYjE8JLARZPQsUSim8foIRYTlGr02Ly7piASFRtKJ4VfieYhxdS2JcDVMN6xVOKZyrCGm8b108lrLRVzvptLH7IoEFLFANes6KnDi+uxfmvFnF17oALq5u1agu3/YfHkcSFzeSggV5eXRfIB7CHNcO5SUI+Ih5Ir7f4MAV9IqdFzdZgNpZw1Gcs1mNvgG
With the shared site navigation extracted into header.html, include it in every template that needs the top navigation, for example in signin.html:

${ render "header.html" }

Note that the ${ } delimiters are used to avoid clashing with the front end's Vue.js tags. Similarly, pull the axios wrapper logic out into its own partial, myaxios.html:

<script>
const myaxios = function (url, type, data = {}) {
    return new Promise((resolve) => {
        if (type === "get" || type === "delete") {
            axios({ method: type, url: url, params: data }).then((result) => {
                resolve(result.data);
            });
        } else {
            const params = new URLSearchParams();
            for (var key in data) {
                params.append(key, data[key]);
            }
            axios({ method: type, url: url, data: params }).then((result) => {
                resolve(result.data);
            });
        }
    });
};
</script>

Then include it wherever it is needed:

${ render "myaxios.html" }

This way only the shared parts of the templates have to be maintained. After the changes, the project structure looks like this:

.
├── README.md
├── assets
│   ├── css
│   │   └── style.css
│   └── js
│       ├── axios.js
│       └── vue.js
├── database
│   └── database.go
├── favicon.ico
├── go.mod
├── go.sum
├── handler
│   ├── admin.go
│   └── user.go
├── main.go
├── model
│   └── model.go
├── mytool
│   └── mytool.go
├── tmp
│   └── runner-build
└── views
    ├── admin
    │   └── user.html
    ├── admin_header.html
    ├── header.html
    ├── index.html
    ├── myaxios.html
    ├── signin.html
    └── test.html

JWT generation logic

JSON Web Token (JWT) is an open internet standard (RFC 7519) that defines a compact, self-contained way to transmit information between parties as a JSON object; the information can be verified and trusted because it is digitally signed. In plain terms: after a successful login, the server generates a JSON object and returns it to the front end, something like:

{ "uid": 1 }

From then on, the client sends this object with every request, and the server relies on it alone to identify the user. To prevent the object from being tampered with, the server signs it at generation time, producing something like:

eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJVaWQiOjEsImlhdCI6MTY2MTg0MjYxMCwiZXhwIjoxNjYxODQ1NjEwfQ.BXK9awvVCk7L3JAnDGt9z6U9TOjPCpI0AcHRu1eq_mo

This is what is commonly called a token. In our project we need to generate a token for every account that passes the login check. First install the jwt package:

go get -u github.com/kataras/iris/v12/middleware/jwt

Then modify the local utility package mytool:

package mytool

import (
    "crypto/md5"
    "fmt"
    "time"

    "github.com/dchest/captcha"
    "github.com/kataras/iris/v12"
    "github.com/kataras/iris/v12/middleware/jwt"
)

var SigKey = []byte("signature_hmac_secret_shared_key")

type PlayLoad struct {
    Uid uint
}

func GenerateToken(uid uint) string {
    signer := jwt.NewSigner(jwt.HS256, SigKey, 50*time.Minute)
    claims := PlayLoad{Uid: uid}
    token, err := signer.Sign(claims)
    if err != nil {
        fmt.Println(err)
    }
    s := string(token)
    return s
}

As is well known, a token consists of three parts: header, payload, and signature. Here the SigKey string is the signing secret and the PlayLoad struct carries the payload; signing produces a token, which must be cast to a string before being returned to the front end. Note that signing uses the HS256 algorithm, and for safety the token is given a lifetime, here 50 minutes.

Next, modify the user login logic so the generated token is returned to the front end:

// login action
func Signin(ctx iris.Context) {
    ret := make(map[string]interface{}, 0)
    cid := ctx.PostValue("cid")
    code := ctx.PostValue("code")
    if captcha.VerifyString(cid, code) == false {
        ret["errcode"] = 2
        ret["msg"] = "login failed: wrong captcha"
        ctx.JSON(ret)
        return
    }
    db := database.Db()
    defer func() {
        _ = db.Close()
    }()
    Username := ctx.PostValue("username")
    Password := ctx.PostValue("password")
    user := &model.User{}
    db.Where(&model.User{Username: Username, Password: mytool.Make_password(Password)}).First(&user)
    if user.ID == 0 {
        ret["errcode"] = 1
        ret["msg"] = "login failed: wrong username or password"
        ctx.JSON(ret)
        return
    }
    token := mytool.GenerateToken(user.ID)
    fmt.Println(token)
    ret["errcode"] = 0
    ret["msg"] = "login succeeded"
    ret["username"] = user.Username
    ret["token"] = token
    ctx.JSON(ret)
}

The front end receives a response like this:

{
    "errcode": 0,
    "msg": "login succeeded",
    "token": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJVaWQiOjEsImlhdCI6MTY2MTg0MzI5MSwiZXhwIjoxNjYxODQ2MjkxfQ.547z3nv4qj2-UeHTzfeG_qSsnFZD2DFyCP9gNZ-QiHA",
    "username": "123"
}

After receiving the token, the front end stores it in localStorage and jumps to the admin page:

// login request
signin: function () {
    this.myaxios("http://localhost:5000/signin/", "post", {
        "username": this.username,
        "password": this.password,
        "cid": this.cid,
        "code": this.code
    }).then(data => {
        console.log(data);
        alert(data.msg);
        localStorage.setItem("token", data.token);
        window.location.href = "/admin/user/";
    });
}

localStorage persists indefinitely. If you need stronger security, use sessionStorage instead: the token's lifetime then follows the browser session, though that makes the server-side expiry we configured earlier less meaningful. Each approach has trade-offs; weigh them for yourself.

Middleware authentication

Middleware is a class of software that sits between system software and applications, easing communication between the parts of a system; with it, applications can share information and resources across different technology stacks. In plain terms: put a layer of logic in front of every endpoint that requires authentication, so it can be managed and controlled in one place:

verifier := jwt.NewVerifier(jwt.HS256, mytool.SigKey)
verifyMiddleware := verifier.Verify(func() interface{} {
    return new(mytool.PlayLoad)
})

This declares the middleware variable verifyMiddleware, which returns a payload struct instance. Then attach the middleware to all the admin endpoints, including the admin templates:

adminhandler := app.Party("/admin")
adminhandler.Use(verifyMiddleware)
adminhandler.Get("/userlist/", handler.Admin_userlist)
adminhandler.Delete("/user_action/", handler.Admin_userdel)
adminhandler.Put("/user_action/", handler.Admin_userupdate)
adminhandler.Post("/user_action/", handler.Admin_useradd)
adminhandler.Get("/user/", handler.Admin_user_page)

Now every admin operation goes through the middleware's authentication. In other words, if a request carries no token, or an invalid token, it will not get normal data back: visiting http://localhost:5000/admin/user without a token is rejected, while the same request with a token succeeds. Since every subsequent admin request must carry the token, rework the myaxios.html wrapper from above:

<script>
var mytoken = localStorage.getItem("token");
const myaxios = function (url, type, data = {}) {
    return new Promise((resolve) => {
        if (type === "get" ||
            type === "delete") {
            axios({ method: type, url: url + "?token=" + mytoken, params: data }).then((result) => {
                resolve(result.data);
            });
        } else {
            const params = new URLSearchParams();
            for (var key in data) {
                params.append(key, data[key]);
            }
            axios({ method: type, url: url + "?token=" + mytoken, data: params }).then((result) => {
                resolve(result.data);
            });
        }
    });
};
</script>

With this, every back-end request carries the token.

Conclusion

A JWT-style authentication scheme disperses user state to the client. Compared with server-side session storage, it noticeably reduces the server's memory pressure; meanwhile, doing the authentication in Iris middleware improves code reuse and keeps the project easier to maintain, more elegant. The project is open-sourced on GitHub: https://github.com/zcxey2911/IrisBlog — raise a glass with me.
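The token mechanics described above — an HS256 signature over header.payload plus an expiry check in the middleware — can be sketched end to end with nothing but Python's standard library. This is an illustration of the scheme, not the project's actual code (the project uses Iris's jwt package); the claim names mirror the PlayLoad struct:

```python
import base64
import hashlib
import hmac
import json
import time


def _b64url(data: bytes) -> str:
    # JWTs use unpadded URL-safe base64
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def _unb64url(s: str) -> bytes:
    # Restore the padding stripped at encode time
    return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))


def generate_token(uid, key, lifetime_s=50 * 60):
    # header.payload.signature, mirroring GenerateToken's HS256 + 50-minute expiry
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    now = int(time.time())
    payload = _b64url(json.dumps({"Uid": uid, "iat": now, "exp": now + lifetime_s}).encode())
    sig = _b64url(hmac.new(key, f"{header}.{payload}".encode(), hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"


def verify_token(token, key):
    # What the verifier middleware does conceptually:
    # recompute the HMAC over header.payload, compare, then check expiry.
    try:
        header, payload, sig = token.split(".")
    except ValueError:
        return None  # malformed token
    expected = _b64url(hmac.new(key, f"{header}.{payload}".encode(), hashlib.sha256).digest())
    if not hmac.compare_digest(expected, sig):
        return None  # signature mismatch: token was tampered with
    claims = json.loads(_unb64url(payload))
    if claims.get("exp", 0) < time.time():
        return None  # token expired
    return claims


key = b"signature_hmac_secret_shared_key"
t = generate_token(1, key)
print(verify_token(t, key)["Uid"])
# 1
```

The round trip makes the three-part structure concrete: anyone can read the payload (it is only base64), but without the shared secret nobody can forge a valid signature, which is why the expiry lives in the payload rather than on the server.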

Media freedom, silky smooth: mounting cloud drives (Baidu Netdisk / Aliyun Drive) with Alist in a Docker container over the WebDAV protocol, with Python 3.10 integration

Anyone who has used a NAS (Network Attached Storage) knows it turns local disks into a "network drive" on the LAN — in effect, a private cloud — but the hardware and bandwidth costs put it out of reach for most people. The open-source Alist project solves this from the other direction: it turns public cloud drives into something like a networked local disk, mounted and managed through a unified web page. Supported drive types include, but are not limited to, Baidu Netdisk, Aliyun Drive, and Thunder Drive.

Another benefit of mounting drives with Alist is that resources can be played directly over the WebDAV protocol. The drives do offer online playback themselves, but the price is a paid membership — fair enough, the vendors need to make money — while Alist lets us route around that and save some cash. Moreover, the real power of WebDAV is that a share can be mounted as a local (server-side) disk: because it maps to a local directory, playback only needs a local player or an in-browser one. Whether the file is mkv, wmv, or H.265-encoded, a modern player handles it perfectly with no transcoding, so under WebDAV the server's only job is moving bytes.

Deploying Alist with Docker

Alist can be installed and deployed in several ways, but Docker is the most convenient. The drive vendors update their products frequently, so Alist releases follow at the same pace, and Docker redeploys with a couple of simple commands — a good fit for that update cadence. For Docker itself, see my earlier article on building a Gunicorn + Flask cluster with Kubernetes (k8s) on Win10/Mac; I will not repeat it here.

First run in a terminal:

docker run -d --restart=always -v /etc/alist:/opt/alist/data -p 5244:5244 -e PUID=0 -e PGID=0 -e UMASK=022 --name="alist" xhofe/alist:latest

This starts an Alist container in the background, with the service on port 5244. On first run it pulls the latest Alist image:

➜ interview git:(main) docker run -d --restart=always -v /etc/alist:/opt/alist/data -p 5244:5244 -e PUID=0 -e PGID=0 -e UMASK=022 --name="alist" xhofe/alist:latest
Unable to find image 'xhofe/alist:latest' locally
latest: Pulling from xhofe/alist
b1101342f8ad: Pull complete
d9f5c37d20f9: Pull complete
5f4a1655e3cc: Pull complete
c1e599f8ce92: Pull complete
d613bea8ea45: Pull complete
Digest: sha256:520e531ddaf5732c4944d5c35ad4dbb601e2fadae14b99a81e86ea3f7e065173
Status: Downloaded newer image for xhofe/alist:latest
7bf1c7f384526bd22aa078223d548ab0c16b79c245919e8a0cf7b439e79f34d6

Then run:

docker ps

to see the running Alist service container:

➜ ~ docker ps
CONTAINER ID   IMAGE                COMMAND            CREATED       STATUS       PORTS                    NAMES
7bf1c7f38452   xhofe/alist:latest   "/entrypoint.sh"   3 hours ago   Up 3 hours   0.0.0.0:5244->5244/tcp   alist

Alist is built on a decoupled Gin back end and React front end, so the management page needs a username and password. Run:

docker exec -it alist ./alist admin

This enters the container and prints the credentials:

INFO[2023-02-13 22:54:17] admin user's info:
username: admin
password: 8U5js3bH

Write them down. Note this is a local service, so it cannot be logged into from the public internet.

That completes the local deployment. When Alist publishes a new release, update with the following commands:

docker stop alist
#stop the alist container
docker rm -f alist
#remove the alist container; the data was mapped to the host, so it is not deleted
cp -r /root/data/docker_data/alist /root/data/docker_data/alist.bak
#optional: back up the data first if you are nervous
docker pull xhofe/alist:latest
#pull the latest alist image
docker run -d --restart=always -v /root/data/docker_data/alist:/opt/alist/data -p 5244:5244 --name="alist" xhofe/alist:latest
#rerun the install command; note the -v mount path must match the original

The point here is that the -v mount maps Alist's configuration to /root/data/docker_data/alist on the host, so it carries over after the upgrade.

Mounting Baidu Netdisk

With the Alist service deployed, log in locally at http://localhost:5244/@manage with the username and password Docker printed above. After logging in, choose Storage in the left-hand menu and add Baidu Netdisk. Baidu Netdisk operations are entirely based on Baidu's open API; Alist just needs authorization to call it. Open

https://tool.nn.ci/baidu/callback?code=288faa8f669a3d174ea9af0bd1d72ab5

to authorize, note down client_id, client_secret, and refresh_token, fill the three values into the mount form, and set the mount path to the root directory /. It is best to tick the web proxy option in the form as well. Then open the Alist home page at http://localhost:5244 and you can play the resources stored in Baidu Netdisk online. Very convenient.

Mounting Aliyun Drive

As of this article's publication on February 14, mounting Aliyun Drive itself hits a device-id bug, but mounting an Aliyun Drive share still works fine. Aliyun Drive operations are based on a client-side token, so first get the token through the mobile login page:

https://passport.aliyundrive.com/mini_login.htm?lang=zh_cn&appName=aliyun_drive&appEntrance=web&styleType=auto&bizParams=&notLoadSsoView=false&notKeepLogin=false&isMobile=true&hidePhoneCode=true&rnd=0.9186864872885723

After logging in, capture the traffic and grab the response of the back-end login.do request. Copy out the value of bizExt and decode it with Python's base64 module:

import base64
coded_string = '''Q5YACgA...'''
base64.b64decode(coded_string)

The refreshToken in the decoded result is the credential we need:

"refreshToken":"sdfdsfsdfdsfb9fadd4f62ee4be968e"

Then fill the token and the share id into the form in the admin page. Note the mount path here cannot be the root directory /, because Baidu Netdisk is already mounted there; pick a subdirectory such as share. With that, the Aliyun Drive share is mounted — sit back and reward yourself.

Python 3.10 integration

Beyond online playback, we can also use Python 3.10 to operate the Alist-mounted drives directly over WebDAV — almost magical. First install the WebDAV library:

pip3 install webdavclient3

Then write webdav.py:

from webdav3.client import Client

options = {
    'webdav_hostname': "http://localhost:5244/dav",
    'webdav_login': "admin",
    'webdav_password': "8U5js3bH"
}
client = Client(options)
client.verify = False  # To not check SSL certificates (Default = True)
files1 = client.list()
print(files1)

Here webdav_hostname is the WebDAV service path exposed by the Docker deployment above, and the login and password are the ones Docker returned earlier; no need to worry about leaking them, since the service is local only.

The program returns:

➜ gotest /opt/homebrew/bin/python3.10
"/Users/liuyue/wodfan/work/gotest/webdav.py"
['dav/', 'aliyunpan/', 'The.Last.of.Us.S01E03.1080p.WEB-DL.DDP5.1.Atmos.H.264-Q66.mkv', 'The.Last.of.Us.S01E05.1080p.WEB-DL.DDP5.1.Atmos.H.264-Q66.mkv', 'The.Last.of.Us.S01E04.1080p.WEB-DL.DDP5.1.Atmos.H.264-Q66.mkv', 'house.of.the.dragon.s01e08.1080p.web.h264-cakes.chs.eng.mp4', 'House.of.the.Dragon.S01E07.Driftmark.1080p.HMAX.WEB-DL.DDP5.1.Atmos.H.264-SMURF.chs.eng.mp4', 'House.of.the.Dragon.S01E06.The.Princess.and.the.Queen.720p.HMAX.WEB-DL.DDP5.1.H.264-NTb.chs.eng.mp4', 'House.of.the.Dragon.S01E05.We.Light.the.Way.1080p.HMAX.WEB-DL.DDP5.1.Atmos.H.264-SMURF.chs.eng.mp4', 'house.of.the.dragon.s01e04.720p.web.h264-cakes.chs.eng.mp4', 'house.of.the.dragon.s01e03.720p.web.h264-cakes.chs.eng.mp4', 'share/']

It conveniently returns the directory listing of the mounted drives. Beyond that, we can also perform dynamic create/delete/update/query operations on drive resources:

# Create directory
client.mkdir("dir1/dir2")

# Delete resource
client.clean("dir1/dir2")

# Copy resource
client.copy(remote_path_from="dir1/file1", remote_path_to="dir2/file1")
client.copy(remote_path_from="dir2", remote_path_to="dir3")

# Move resource
client.move(remote_path_from="dir1/file1", remote_path_to="dir2/file1")
client.move(remote_path_from="dir2", remote_path_to="dir3")

# Download a resource
client.download_sync(remote_path="dir1/file1", local_path="~/Downloads/file1")
client.download_sync(remote_path="dir1/dir2/", local_path="~/Downloads/dir2/")

# Upload resource
client.upload_sync(remote_path="dir1/file1", local_path="~/Documents/file1")
client.upload_sync(remote_path="dir1/dir2/", local_path="~/Documents/dir2/")

In other words, once Alist has the drives mounted, we do not even need the web UI — code alone gives us full control over the drive resources.

Conclusion

"The swallows that once graced noble halls now fly into ordinary homes." With nothing but a networked computer you can run your own private cloud, at an absurdly low cost. Alist, a Prometheus for the new era, brings us the fire of cloud-drive freedom.
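One step from the Aliyun section above is worth rounding out: the bizExt value is base64-encoded JSON, so instead of eyeballing the decoded bytes for refreshToken, you can parse the document and pull the field out programmatically. A sketch with fabricated sample data (the real bizExt is much larger, and the key path used here is a hypothetical stand-in — inspect the actual decoded payload to find where refreshToken lives):

```python
import base64
import json


def extract_refresh_token(biz_ext_b64: str) -> str:
    # bizExt decodes to a JSON document; the token sits somewhere inside it
    decoded = base64.b64decode(biz_ext_b64).decode("utf-8")
    doc = json.loads(decoded)
    # "pds_login_result" is an assumed key path for illustration only
    return doc["pds_login_result"]["refreshToken"]


# fabricated sample standing in for the real captured bizExt value
sample = base64.b64encode(
    json.dumps({"pds_login_result": {"refreshToken": "sdfdsfsdfdsfb9fadd4f62ee4be968e"}}).encode()
).decode()
print(extract_refresh_token(sample))
# sdfdsfsdfdsfb9fadd4f62ee4be968e
```

Parsing rather than string-searching also fails loudly (a KeyError) if the payload format changes, which is preferable to silently copying the wrong value into the mount form.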

Fast as lightning, swift as the wind, the rainbow goddess takes flight: advanced Go with the high-performance web framework Iris in practice — project initialization, EP00

In the world of Go web programming, speak not of performance unless you speak of Iris. The rainbow goddess's name echoes across the land. On the single measure of speed, nothing comes close — even Gin, famed far and wide for its lean simplicity, trails in her wake. The goddess glances back with a smile: "You are not that slow, really — I am just a little bit faster..." This time, let us look the rainbow goddess Iris in the face and see for ourselves what the fastest web framework in the universe can do.

The goddess herself (Iris)

There are many criteria for choosing a framework — flexibility, extensibility, API friendliness, documentation quality, project activity, community contributions — but performance and memory footprint absolutely deserve top priority. The reason is simple: in martial arts, only speed is unbeatable; one burst of speed hides a hundred flaws. In a downturn, with everyone cutting costs and chasing efficiency, a high-performance framework holds an enormous advantage. Put bluntly: at the same cost, if my request throughput per unit time is double yours, you fall over before I have even tried — the game is over before it starts. Don't take my word for it; see the latest 2022 request-throughput comparison charts.

In essence Iris is also a community-driven Go web framework: it supports HTTP/2 and HTTP/3, has full MVC support, and follows a minimalist style every bit as light and simple as Gin's, with a very active community and thorough documentation to match — but its almost frightening performance is something the other frameworks can only watch from a distance. In Iris you can see a near-obsessive pursuit of performance: to optimize it, she builds or adopts the fastest components available, such as the built-in golog module for logging and the third-party jsoniter library for JSON serialization — perfectionism in framework design, taken to the limit.

Creating the IrisBlog project

Following the official Iris documentation at https://github.com/kataras/iris/blob/master/README_ZH.md, and with the rainbow goddess's blessing, we will build the fastest blog system in history. First create the folder IrisBlog:

mkdir IrisBlog
cd IrisBlog

Then initialize the project with go mod:

C:\Users\liuyue\www\iriblog>go mod init IrisBlog
go: creating new go.mod: module IrisBlog

If go mod is unfamiliar, see my earlier Go package-management tutorial (EP10); I will not repeat it here. Next, for reasons everyone understands, make sure to use a domestic module proxy:

go env -w GOPROXY=https://goproxy.cn,direct

Then install the rainbow goddess Iris:

go get github.com/kataras/iris/v12@master

The system returns:

C:\Users\liuyue\www\iriblog>go get -u github.com/kataras/iris
go: downloading github.com/kataras/iris v0.0.2
go: downloading github.com/kataras/golog v0.1.7
go: downloading github.com/kataras/pio v0.0.10
go: downloading github.com/kataras/sitemap v0.0.5
go: downloading github.com/kataras/tunnel v0.0.4
go: downloading github.com/json-iterator/go v1.1.10
go: downloading github.com/klauspost/compress v1.15.9
go: downloading github.com/andybalholm/brotli v1.0.4
go: added github.com/kataras/iris v0.0.2
go: added github.com/kataras/golog v0.1.7
go: added github.com/kataras/pio v0.0.10
go: added github.com/json-iterator/go v1.1.12
go: added gopkg.in/yaml.v3 v3.0.1
(long dependency download log truncated)

Once installed, open the project's go.mod file to inspect Iris's dependency list:

module IrisBlog

go 1.18

require (
    github.com/BurntSushi/toml v1.2.0 // indirect
    github.com/kataras/iris/v12 v12.2.0-beta4.0.20220813060700-f91269130ed3 // indirect
    github.com/kataras/golog v0.1.7 // indirect
    github.com/kataras/pio v0.0.10 // indirect
    github.com/kataras/blocks v0.0.6 // indirect
    github.com/json-iterator/go v1.1.12 // indirect
    github.com/klauspost/compress v1.15.9 // indirect
    github.com/andybalholm/brotli v1.0.4 // indirect
    golang.org/x/net v0.0.0-20220812174116-3211cb980234 // indirect
    gopkg.in/yaml.v3 v3.0.1 // indirect
    (full require list truncated)
)

Next, create the main entry file in the project root:

package main

import "github.com/kataras/iris/v12"

func main() {
    app := iris.New()
    app.Use(iris.Compression)
    app.Get("/", func(ctx iris.Context) {
        ctx.HTML("你好 <strong>%s</strong>!", "女神")
    })
    app.Listen(":5000")
}

Then start the Iris service from the terminal:

go run main.go

The system returns:

Iris Version: 12.2.0-beta4

Now listening on: http://localhost:5000
Application started. Press CTRL+C to shut down.

What are you waiting for? Visit http://localhost:5000 — after a thousand calls, she finally appears. Ctrl+C stops the service; run go run main.go again to recompile and restart.

Hot reload for Iris: fresh

As everyone knows, Go is a compiled language, so every code change requires a rebuild, and Iris currently ships no built-in hot-reload tool. Here we can use the third-party package fresh, which greatly improves Iris development efficiency. Run the install from outside the project directory:

go get github.com/pilu/fresh

Note: do not run this inside the project directory — in go mod mode it would be treated as a project dependency, and no fresh executable would be generated in the system's bin directory. Then enter the project:

cd IrisBlog

and start the Iris service with fresh:

C:\Users\liuyue\www\iriblog>fresh
0:19:33 runner | InitFolders
0:19:33 runner | mkdir ./tmp
0:19:33 watcher | Watching .
0:19:33 main | Waiting (loop 1)...
0:19:33 main | receiving first event /
0:19:33 main | sleeping for 600 milliseconds
0:19:33 main | flushing events
0:19:33 main | Started! (5 Goroutines)
0:19:33 main | remove tmp\runner-build-errors.log: The system cannot find the file specified.
0:19:33 build | Building...
0:19:44 runner | Running...
0:19:45 main | --------------------
0:19:45 main | Waiting (loop 2)...
0:19:46 app | Iris Version: 12.2.0-beta4
0:19:46 app |
0:19:46 app | Now listening on: http://localhost:5000
Application started. Press CTRL+C to shut down.

Now every package file in the project is watched; when code is modified, a build is triggered automatically — roughly analogous to the event-loop mechanism of Python's Tornado framework. After a change, fresh detects the modification and builds immediately:

0:28:02 watcher | sending event ".\\main.go": MODIFY
0:28:02 main | receiving first event ".\\main.go": MODIFY
0:28:02 main | sleeping for 600 milliseconds
0:28:02 main | flushing events
0:28:02 main | Started! (8 Goroutines)
0:28:02 build | Building...
0:28:10 runner | Running...
0:28:10 runner | Killing PID 11276
0:28:11 main | --------------------
0:28:11 main | Waiting (loop 3)...
0:28:12 app | Iris Version: 12.2.0-beta4
0:28:12 app | Now listening on: http://localhost:5000
Application started. Press CTRL+C to shut down.

No more manual recompiles — simple and convenient. If you have custom requirements, add a configuration file runner.conf to the project:

root: .
tmp_path: ./fresh
build_name: runner_build
build_log: runner_build_errors.log
valid_ext: .go, .tpl, .tmpl, .html, .md, .log
no_rebuild_ext: .tpl, .tmpl, .html
ignored: assets, tmp, log
build_delay: 3000
colors: 1
log_color_main: cyan
log_color_build: yellow
log_color_runner: green
log_color_watcher: magenta
log_color_app: red

This lets you customize watched files, the build log, ignored files and directories, the build delay, and so on. After editing, start fresh against the config file:

fresh -c runner.conf

and fresh will watch and build according to the configuration.

Conclusion

The lowest system overhead and the highest request throughput per unit time — that is the rainbow goddess Iris's finest gift to developers. With Iris installed and hot reload configured, we have taken the first step of the long march of Go web development. The mountains are high and the road is long, the walls are tall and the moat is deep — but even a reckless start and a clumsy finish beat looking down on the work and never doing it. Don't you agree?

Raise a glass and talk at leisure: pairing ChatGPT with a DingTalk bot (outgoing callback) to build AI group-chat and one-on-one chat scenarios, based on Python 3.10

Like a nuclear bomb suddenly born into the age of black powder, the arrival of OpenAI's ChatGPT language model is a major milestone in the history of artificial intelligence. It is an unparalleled, extraordinary model, capable of natural-language reasoning and dialogue, with outstanding language-generation ability. Yes — the opening of this article was itself generated by ChatGPT. What is there to say in the face of an AI product so far ahead of its time? Worship it? Marvel at it as superhuman? Any words describing ChatGPT are already pale, and every adjective in the dictionary falls short. In one sentence: language cannot do its greatness justice. This time we will wire ChatGPT's open API into a DingTalk group/single-chat bot, giving the bot the ability to reason and converse in natural language — turning the mundane into the miraculous, nothing less.

Registering and using OpenAI's ChatGPT

First register on the OpenAI platform: https://beta.openai.com/. ChatGPT is so popular that registration fails in many regions, so a North American proxy IP is recommended. At the same time, take care: if you later want to call ChatGPT through its back-end API, do not sign in with a Google or Microsoft third-party account, because you will then be unable to exchange your email address and password for the platform's access_token. Remember that. The phone number receiving the verification code must also be North American; here is a handy SMS-receiving service for the region: https://sms.qisms.com/index

After registering successfully, I recommend the open-source SDK already wrapped by the GitHub user rawandahmad698 rather than reinventing the wheel: https://github.com/rawandahmad698/PyChatGPT

Install the SDK:

pip3 install chatgptpy --upgrade

Once installed, write a test script:

chat = Chat(email="OpenAI email", password="OpenAI password", proxies="proxy address")
answer = chat.ask("你好")
print(answer)

Note: before running the code, be sure to set the proxies argument, and make sure the IP address is North American.

The program returns (long auth URLs, state strings, and the access token abridged):

[OpenAI] Email address: ********
[OpenAI] Password: *********
[OpenAI] Using proxy: {'http': 'http://localhost:4780', 'https': 'http://localhost:4780'}
[OpenAI] Beginning auth process
[OpenAI][1] Making request to https://chat.openai.com/auth/login
[OpenAI][1] Request was successful
[OpenAI][2] Grabbing CSRF token from https://chat.openai.com/api/auth/csrf
[OpenAI][2] Request was successful
[OpenAI][3] Making request to https://chat.openai.com/api/auth/signin/auth0?prompt=login
[OpenAI][3] Request was successful
[OpenAI][4] Request was successful
[OpenAI][5] No captcha detected
[OpenAI][6] Making request to https://auth0.openai.com/u/login/identifier
[OpenAI][6] Email found
[OpenAI][7] Entering password...
[OpenAI][7] Password was correct
[OpenAI][8] All good
[OpenAI][8] Access Token: eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCIsImtpZCI6Ik1UaEVOVUpHTkVNMVFURTRNMEZCTWpkQ05UZzVNRFUxUlRVd1FVSkRNRU13UmtGRVFrRXpSZyJ9... (truncated)
[OpenAI][9] Saving access token...
[OpenAI][8] Saved access token

首次运行程序会通过代理自动登录OpenAI平台并换取token,最后将token存储在本地。随后返回ChatGPT的信息:

➜ mydemo git:(master) ✗ /opt/homebrew/bin/python3.10 "/Users/liuyue/wodfan/work/mydemo/test_chatgpt.py"
Using proxies: http://localhost:4780
你好,很高兴为你提供帮助。有什么需要我帮忙的吗?

至此,ChatGPT接口就调试好了。

配置钉钉Dingding机器人

随后,我们来配置C端的机器人。注意这里一定要使用支持outgoing回调的企业机器人,而不是普通的机器人,参考文档:https://open.dingtalk.com/document/group/enterprise-created-chatbot

创建好企业机器人之后,获取机器人应用的Key和密钥,同时配置好出口IP和接口地址。所谓出口IP即调用钉钉服务的合法ip,消息接收地址是接收C端信息的地址,这里我们使用异步非阻塞的Tornado框架来构建接收信息的服务:

```python
import hmac
import hashlib
import base64
import json
import tornado.ioloop
import tornado.web
from tornado.options import define, options

define('port', default=8000, help='default port', type=int)


class Robot(tornado.web.RequestHandler):

    async def post(self):
        timestamp = self.request.headers.get('timestamp', None)
        sign = self.request.headers.get('sign', None)
        app_secret = '钉钉机器人密钥'
        app_secret_enc = app_secret.encode('utf-8')
        string_to_sign = '{}\n{}'.format(timestamp, app_secret)
        string_to_sign_enc = string_to_sign.encode('utf-8')
        hmac_code = hmac.new(app_secret_enc, string_to_sign_enc, digestmod=hashlib.sha256).digest()
        my_sign = base64.b64encode(hmac_code).decode('utf-8')
        if sign != my_sign:
            return self.finish({"errcode": 1, "msg": "签名有误"})
        data = json.loads(self.request.body)
        text = data['text']["content"]
        atUsers = data.get("atUsers", None)
        uid = data.get("senderStaffId", None)
        return self.finish({"errcode": 0, "msg": text})


urlpatterns = [
    (r"/robot_chat/", Robot),
]

# 创建Tornado实例
application = tornado.web.Application(urlpatterns, debug=True)

if __name__ == "__main__":
    tornado.options.parse_command_line()
    application.listen(options.port)
    tornado.ioloop.IOLoop.instance().start()
```

这里我们通过Robot异步控制器来接收所有来自钉钉客户端的信息,即人类对机器人说的话。需要注意的是,后端服务需要对请求头中的timestamp和sign进行验证,以判断是否是来自钉钉的合法请求,避免其他人仿冒钉钉调用开发者的HTTPS服务传送数据。所以这里一旦签名有问题,就结束逻辑:

```python
timestamp = self.request.headers.get('timestamp', None)
sign = self.request.headers.get('sign', None)
app_secret = '钉钉机器人密钥'
app_secret_enc = app_secret.encode('utf-8')
string_to_sign = '{}\n{}'.format(timestamp, app_secret)
string_to_sign_enc = string_to_sign.encode('utf-8')
hmac_code = hmac.new(app_secret_enc, string_to_sign_enc, digestmod=hashlib.sha256).digest()
my_sign = base64.b64encode(hmac_code).decode('utf-8')
if sign != my_sign:
    return self.finish({"errcode": 1, "msg": "签名有误"})
```

最后该接口会返回发信人id(uid)以及具体信息内容(text)。至此,后端接收服务就配置好了。

下面就是后端推送服务。首先,根据官方文档:https://open.dingtalk.com/document/orgapp-server/obtain-the-access_token-of-an-internal-app#topic-2056397 需要获取钉钉接口的token:

```python
def get_token(self):
    res = requests.post("https://api.dingtalk.com/v1.0/oauth2/accessToken",
                        data=json.dumps({"appKey": self._appKey, "appSecret": self._appSecret}),
                        headers={"Content-Type": "application/json"})
    token = res.json()["accessToken"]
    return token
```

随后,根据文档:https://open.dingtalk.com/document/group/chatbots-send-one-on-one-chat-messages-in-batches#topic-2080109 我们来配置单聊推送:

```python
# 单聊
def send_message(self, uid, message):
    res = requests.post("https://api.dingtalk.com/v1.0/robot/oToMessages/batchSend",
                        data=json.dumps({"robotCode": self._appKey,
                                         "userIds": [uid],
                                         "msgKey": "sampleText",
                                         "msgParam": json.dumps({"content": message})}),
                        headers={"Content-Type": "application/json",
                                 "x-acs-dingtalk-access-token": self._token})
    print(res.text)
```

具体效果:

接着,继续根据官方文档:https://open.dingtalk.com/document/robots/guide-to-user-access-for-intra-enterprise-robot-group-chat 配置群聊推送方法:

```python
# 群聊
def send_user(self, uid, message):
    data = {
        "at": {"atUserIds": [uid]},
        "text": {"content": message},
        "msgtype": "text",
    }
    res = requests.post(self._webhook, data=json.dumps(data),
                        headers={"Content-Type": "application/json"})
    print(res.text)
```

群聊效果:

这里需要注意的是,单聊是通过接口的方式进行推送,而群内聊天是通过webhook方式进行推送,关于webhook,请移玉步至:使用python3.7配置开发钉钉群自定义机器人(2020年新版攻略)

完整代码:

```python
import requests
import json
from pychatgpt import Chat


class DingDing:

    def __init__(self, appKey=None, appSecret=None) -> None:
        self._appKey = appKey
        self._appSecret = appSecret
        self._token = self.get_token()
        # 机器人webhook地址
        self._webhook = ""

    def get_token(self):
        res = requests.post("https://api.dingtalk.com/v1.0/oauth2/accessToken",
                            data=json.dumps({"appKey": self._appKey, "appSecret": self._appSecret}),
                            headers={"Content-Type": "application/json"})
        return res.json()["accessToken"]

    # 单聊
    def send_message(self, uid, message):
        res = requests.post("https://api.dingtalk.com/v1.0/robot/oToMessages/batchSend",
                            data=json.dumps({"robotCode": self._appKey,
                                             "userIds": [uid],
                                             "msgKey": "sampleText",
                                             "msgParam": json.dumps({"content": message})}),
                            headers={"Content-Type": "application/json",
                                     "x-acs-dingtalk-access-token": self._token})
        print(res.text)

    # 群聊
    def send_user(self, uid, message):
        data = {
            "at": {"atUserIds": [uid]},
            "text": {"content": message},
            "msgtype": "text",
        }
        res = requests.post(self._webhook, data=json.dumps(data),
                            headers={"Content-Type": "application/json"})
        print(res.text)


dingding = DingDing("appkey", "appSecret")
# chat = Chat(email="OpenAI邮箱", password="OpenAI密码", proxies="代理地址")
# answer = chat.ask("你好")
# dingding.send_message('uid', answer)
# dingding.send_user('uid', answer)
# print(answer)
```

至此,后端推送服务就配置好了。

结语

最后,奉上Github项目地址,与众亲同飨:https://github.com/zcxey2911/Python_ChatGPT_ForDingding_OpenAi ,毫无疑问,ChatGPT是NLP领域历史上最伟大的项目,没有之一,伟大,就是技术层面的极致,你同意吗?
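附:上文用于校验钉钉请求合法性的签名逻辑,可以抽成一个可独立测试的小函数。以下函数名与示例取值均为演示用的假设,算法本身与上文一致:

```python
import base64
import hashlib
import hmac


def dingtalk_sign(timestamp: str, app_secret: str) -> str:
    # 钉钉outgoing回调的签名:以appSecret为密钥,
    # 对 "timestamp\nappSecret" 做HMAC-SHA256,再base64编码
    string_to_sign = '{}\n{}'.format(timestamp, app_secret)
    digest = hmac.new(app_secret.encode('utf-8'),
                      string_to_sign.encode('utf-8'),
                      digestmod=hashlib.sha256).digest()
    return base64.b64encode(digest).decode('utf-8')


def verify_sign(timestamp: str, sign: str, app_secret: str) -> bool:
    # 使用恒定时间比较,避免通过响应时间泄露签名信息
    return hmac.compare_digest(dingtalk_sign(timestamp, app_secret), sign)
```

服务端收到请求后,取请求头中的timestamp,与本地保存的appSecret重新计算签名,再与请求头中的sign比较即可判定请求是否合法。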

兔起鹘落全端涵盖,Go lang1.18入门精炼教程,由白丁入鸿儒,全平台(Sublime 4)Go lang开发环境搭建EP00

Go lang,为并发而生的静态语言,源于C语言又不拘泥于效率,高效却不流于古板,Python灵活,略输性能,Java严谨,稍逊风骚。君不见各大厂牌均纷纷使用Go lang对自己的高并发业务进行重构,原因无他,经济下行的大背景之下,性能突出、效率拉满的Go lang无疑是高并发场景下节约服务器资源的一剂灵药。与时俱进,顺应潮流,本次我们乘着市场的东风,在各大主流平台(Win/Mac/Linux/Docker)安装并搭建Go lang1.18的开发环境,短时间内做到能够在任何一款开发机或者服务器上输出Go lang的全部功力,如臂使指,挥洒自如。

Windows11平台

首先来到市场占有率最高的Win11系统,前往Go lang官网 https://go.dev/dl/ 下载win平台下的64位安装包:

Microsoft Windows Windows 7 or later, Intel 64-bit processor go1.18.5.windows-amd64.msi (130MB)

选择安装目录后,直接点击安装即可。安装完毕之后,按下“win+R”组合键打开“运行”对话框,在里面输入命令:control system。在打开的系统信息界面中,选择左侧菜单的“高级系统设置”,随后在打开的“系统属性”窗口选择下方的“环境变量”选项。最后在打开的环境变量中,检查系统是否将Go lang的安装目录"c:/go/bin"配置到了环境变量里,如果已经配置了,在终端中键入命令:

go version

系统返回:

C:\Users\liuyue>go version
go version go1.18.5 windows/amd64

说明Go lang1.18版本已经在系统中安装成功。

Mac平台

接着来到Mac系统,Mac系统一般会包含两套架构,分别是:搭载Intel芯片的x86架构系统,和搭载M系列芯片的ARM架构系统。首先打开终端,键入如下命令:

uname -m

如果返回:

arm64

说明是ARM架构系统;反之返回:

x86_64

则是Intel芯片的x86架构系统。前往Go lang官网 https://go.dev/dl/ ,ARM架构系统下载:

Apple macOS (ARM64) macOS 11 or later, Apple 64-bit processor go1.18.5.darwin-arm64.pkg (132MB)

x86架构系统下载:

Apple macOS (x86-64) macOS 10.13 or later, Intel 64-bit processor go1.18.5.darwin-amd64.pkg (138MB)

下载之后,双击进行安装即可。

区别于Windows平台,我们还可以使用更加灵活的方式安装配置Go lang1.18,那就是鼎鼎有名的Homebrew。Homebrew是一款自由及开放源代码的软件包管理系统,用以简化macOS系统上的软件安装过程,最初由马克斯·霍威尔写成。因其可扩展性得到了一致好评,而在Ruby on Rails社区广为人知。Homebrew使用GitHub,通过用户的贡献扩大对软件包的支持,同样也支持Go lang生态环境。

首先安装Homebrew:

/bin/zsh -c "$(curl -fsSL https://gitee.com/cunkai/HomebrewCN/raw/master/Homebrew.sh)"

随后运行命令清理缓存和更新版本:

brew cleanup && brew update

接着运行命令进行go lang1.18的安装操作:

brew install go

接着会进行下载安装操作:

brew install go
==> Downloading https://ghcr.io/v2/homebrew/core/go/manifests/1.18.5
Already downloaded: /Users/liuyue/Library/Caches/Homebrew/downloads/819fc08bdc0ecafc9713bdfd76a9e6901172c0b2c0cdde0dd482a0b37ba008fd--go-1.18.5.bottle_manifest.json
==> Downloading https://ghcr.io/v2/homebrew/core/go/blobs/sha256:4f80cc29d711ddc5038f6b4684fe31674df01284aaa611480
==> Downloading from
https://pkg-containers.githubusercontent.com/ghcr1/blobs/sha256:4f80cc29d711ddc5038f6b4684fe3 ######################################################################## 100.0% ==> Pouring go--1.18.5.arm64_monterey.bottle.tar.gz ???? /opt/homebrew/Cellar/go/1.18.5: 11,990 files, 596.2MB ==> Running `brew cleanup go`... Disable this behaviour by setting HOMEBREW_NO_INSTALL_CLEANUP. Hide these hints with HOMEBREW_NO_ENV_HINTS (see `man brew`).由于诸位可以理解的原因,这里建议大家用学术的方式连接互联网从而获取更快的下载速度。最后执行命令清理安装包缓存:brew cleanup go接着键入命令就可以查看go lang具体的安装目录:brew list go /opt/homebrew/Cellar/go/1.18.5/bin/go /opt/homebrew/Cellar/go/1.18.5/bin/gofmt /opt/homebrew/Cellar/go/1.18.5/libexec/api/ (22 files) /opt/homebrew/Cellar/go/1.18.5/libexec/bin/ (2 files) /opt/homebrew/Cellar/go/1.18.5/libexec/doc/ (5 files) /opt/homebrew/Cellar/go/1.18.5/libexec/lib/ (3 files) /opt/homebrew/Cellar/go/1.18.5/libexec/misc/ (393 files) /opt/homebrew/Cellar/go/1.18.5/libexec/pkg/ (695 files) /opt/homebrew/Cellar/go/1.18.5/libexec/src/ (7786 files) /opt/homebrew/Cellar/go/1.18.5/libexec/test/ (3071 files) /opt/homebrew/Cellar/go/1.18.5/libexec/ (6 files)在终端键入命令:go version系统返回:➜ ~ go version go version go1.18.5 darwin/arm64这里建议使用Homebrew来安装Go lang,brew会根据当前系统架构来自动选择不同系统架构的编译版本来进行安装。Ubuntu/Centos首先删除 /usr/local/go 目录,根据官网说明,如果之前有安装过 go,那么需要将该位置的 go 目录删除掉 :sudo rm -rf /usr/local/go接着下载安装并安装:# 下载安装包 $ wget https://golang.google.cn/dl/go1.18.linux-amd64.tar.gz # 解压 golang 到 /usr/local 下 $ sudo tar xzvf go1.18.linux-amd64.tar.gz -C /usr/local最后设置环境变量:# 修改 $HOME/.profile 或 /etc/profile 文件 # 这里可能会出现权限不足(ubuntu需要加sudo, centos需要切换成root权限) $ sudo vim /etc/profile # 在该文件最后一行插入(进入后,按 i键进入编辑模式) $ export PATH=$PATH:/usr/local/go/bin # 按 esc 退出编辑模式, 按 :wq 保存文件 $ go version系统返回:go version go1.18 linux/amd64Docker容器搭建如果我们不希望go lang在系统中留下些许的痕迹,Docker也可以帮我们快速搭建开发环境,关于Docker的安装,请移玉步至:一寸宕机一寸血,十万容器十万兵|Win10/Mac系统下基于Kubernetes(k8s)搭建Gunicorn+Flask高可用Web集群。首先建立环境文件夹:mkdir mygo cd mygo随后创建测试脚本hello.go:package main func main() { 
println("hello go1.18") }接着创建Docker镜像打包文件Dockerfile:FROM golang:alpine WORKDIR /build COPY hello.go . RUN go build -o hello hello.go CMD ["./hello"]这里的创建逻辑是基础镜像选择alpine,容器内创建build编译文件夹,将hello.go拷贝到build目录下,随后运行容器内的go编译器对脚本进行打包,最后运行打包后的可执行文件。运行命令打包镜像:docker build -t go .随后系统自动下载基础镜像并且编译:Sending build context to Docker daemon 3.072kB Step 1/5 : FROM golang:alpine ---> 15115d36d05e Step 2/5 : WORKDIR /build ---> Using cache ---> 09ea4177a5f7 Step 3/5 : COPY hello.go . ---> 20ff0208e342 Step 4/5 : RUN go build -o hello hello.go ---> Running in c03d13c80c36 Removing intermediate container c03d13c80c36 ---> c41673d8b447 Step 5/5 : CMD ["./hello"] ---> Running in 8f74af4426cf Removing intermediate container 8f74af4426cf ---> caf626888641 Successfully built caf626888641 Successfully tagged go:latest查看镜像明细:docker images返回明细:[root@instance-7dojaq0e mygo]# docker images REPOSITORY TAG IMAGE ID CREATED SIZE go latest caf626888641 43 minutes ago 329MB golang alpine 15115d36d05e 15 hours ago 328MB运行docker命令启动容器:docker run -it --rm go系统返回:hello go1.18至此,Docker搭建go lang1.18环境就完成了。Sublime 4 for Go lang编写Go lang代码也可以选择轻量编辑器Sublime,是的,一律千篇的Goland多多少少有点审美疲劳,关于Sublime 4的安装,请移步:轻盈潇洒卓然不群,敏捷编辑器Sublime text 4中文配置Python3开发运行代码环境(Win11+M1 mac)这里不再赘述。打开Sublime 4,使用组合键:control + shift + p如果是mac平台,键入:cmd + shift + p在弹出的命令行中选择:Install Package然后输入:Golang Build 按回车安装安装成功后,配置go lang安装路径:首选项 -> package settings -> Golang Config -> Settings - Uesrs{ "PATH": "C:/Go/bin", "GOPATH": "C:/Go" }紧接着配置go mod,go mod是Go语言的包管理工具,官方推荐使用,有了它就不再受GOPATH的限制,可以在任何目录初始化项目,打开命令行,键入命令:go env -w GO111MODULE=auto go env -w GOPROXY=https://goproxy.cn,direct然后在Sublime 4中新建一个hello.go文件:package main import "fmt" func main() { fmt.Println("hello go1.18") }接着使用快捷键 control + b 或者 control + shift + b 选择go run编译器运行代码,Mac系统用cmd替换control,如图所示:随后继续安装代码补全插件,键入:control + shift + p 并输入:Install Package输入:Golang Tools Integration 回车进行安装重启Sublime 4。随后即可在代码中进行补全操作:至此,开发编辑器就配置好了。诚然,如果累了,不想折腾,使用微软的vscode配合code 
runner和go插件直接起飞也是可以的,简单直接,方便好用。结语不同于Python或者是Ruby,Go lang不是系统预装的基础语言,所以配置起来相对独立,不需要考虑与系统版本冲突问题。与此同时,Golang还支持交叉编译功能,即在Windows平台可以将代码编译成Linux平台可执行的文件,对于Windows平台用户来说,这无疑是一个重大利好。

吾剑未尝不利,国内Azure平替,科大讯飞人工智能免费AI语音合成(TTS)服务Python3.10接入

微软Azure平台的语音合成(TTS)技术确实神乎其技,这一点在之前的一篇:含辞未吐,声若幽兰,史上最强免费人工智能AI语音合成TTS服务微软Azure(Python3.10接入),已经做过详细介绍,然则Azure平台需要信用卡验证,有一定门槛,对国内用户不太友好,放眼神州,科大讯飞的讯飞开放平台也有语音合成服务接口,可以通过语音合成流式接口将文字信息转化为声音信息。创建语音应用首先注册讯飞开放平台,随后创建语音合成应用:https://console.xfyun.cn/app/myapp 创建成功后,可以获取5个小时的免费语音合成时间,同时获取应用的appid、秘钥和APIKey: 该语音合成能力是通过基于Websocket协议的长连接接口API的方式给开发者提供一个通用的接口。Websocket协议接口具备流式传输能力,适用于需要流式数据传输的AI服务场景,比起集成在客户端的SDK,流接口具备轻量、跨语言的特点;相较于传统的HTTP协议接口,Websocket协议接口有原生支持跨域的优势,换句话说,从前端就可以直接进行语音转换,而不需要后端参与。接口鉴权根据官网的接口文档:https://www.xfyun.cn/doc/tts/online\_tts/API.html ,我们先安装对应的三方库:pip3 install websocket==0.2.1 pip3 install websocket-client==0.56.0由于讯飞的服务端支持的websocket版本是13,所以需要确保请求端使用的库支持该版本。 首先导入基础库,并且预设语音合成类的参数:import websocket import datetime import hashlib import base64 import hmac import json from urllib.parse import urlencode import time import ssl from wsgiref.handlers import format_date_time from datetime import datetime from time import mktime import _thread as thread import os file_path = "/Users/liuyue/wodfan/work/xunfei-ttp" file_name = "demo.mp3" class Ifly: # 初始化 def __init__(self, APPID, APIKey, APISecret, Text): self.APPID = APPID self.APIKey = APIKey self.APISecret = APISecret self.Text = Text # 公共参数(common) self.CommonArgs = {"app_id": self.APPID} # 业务参数(business),更多个性化参数可在官网查看 self.BusinessArgs = {"aue": "lame", "auf": "audio/L16;rate=16000", "vcn": "xiaoyan", "tte": "utf8","sfl":1,"speed":80} self.Data = {"status": 2, "text": str(base64.b64encode(self.Text.encode('utf-8')), "UTF8")}这里把应用的APPID, APIKey, APISecret作为实例化参数进行传入,Text为需要语音合成的文本。 和Http协议一样,Websocekt协议接口也需要鉴权操作,这里需要通过接口密钥基于hmac-sha256计算签名,向讯飞的服务器端发送Websocket协议握手请求: # 生成url def create_url(self): url = 'wss://tts-api.xfyun.cn/v2/tts' # 生成RFC1123格式的时间戳 now = datetime.now() date = format_date_time(mktime(now.timetuple())) # 拼接字符串 signature_origin = "host: " + "ws-api.xfyun.cn" + "\n" signature_origin += "date: " + date + "\n" signature_origin += "GET " + "/v2/tts " + "HTTP/1.1" # 进行hmac-sha256进行加密 signature_sha = 
hmac.new(self.APISecret.encode('utf-8'), signature_origin.encode('utf-8'), digestmod=hashlib.sha256).digest() signature_sha = base64.b64encode(signature_sha).decode(encoding='utf-8') authorization_origin = "api_key=\"%s\", algorithm=\"%s\", headers=\"%s\", signature=\"%s\"" % ( self.APIKey, "hmac-sha256", "host date request-line", signature_sha) authorization = base64.b64encode(authorization_origin.encode('utf-8')).decode(encoding='utf-8') # 将请求的鉴权参数组合为字典 v = { "authorization": authorization, "date": date, "host": "ws-api.xfyun.cn" # 拼接鉴权参数,生成url url = url + '?' + urlencode(v) return url随后实例化转换类,并且生成Websocket协议地址:if __name__ == "__main__": # 测试时候在此处正确填写相关信息即可运行 ifly = Ifly(APPID='', APISecret='', APIKey='', Text="你好这是一个语音合成示例") websocket.enableTrace(False) wsUrl = ifly.create_url() print(wsUrl)程序返回:➜ xunfei-ttp /opt/homebrew/bin/python3.10 "/Users/liuyue/wodfan/work/xunfei-ttp/iflytek-tts.py" wss://tts-api.xfyun.cn/v2/tts?authorization=YXBpX2tleT0iZWNkOTY1MWU1NjA1NjMxNDAyYzAzOGYwY2RkY2JkNDIiLCBhbGdvcml0aG09ImhtYWMtc2hhMjU2IiwgaGVhZGVycz0iaG9zdCBkYXRlIHJlcXVlc3QtbGluZSIsIHNpZ25hdHVyZT0icDN1SU9Xc2RLUG1aM0pJanpNK3RYcXRZOTcxcVA3cW5UclRubmZRQ0dCMD0i&date=Tue%2C+07+Feb+2023+09%3A10%3A49+GMT&host=ws-api.xfyun.cn至此Websocekt鉴权环节就完成了,讯飞的服务端将发起握手时会对接口地址中的authorization参数进行验签操作。语音流式转换随后,我们可以发起Websocket链接了:# 收到websocket错误的处理 def on_error(ws, error): print("### error:", error) # 收到websocket关闭的处理 def on_close(ws): print("### 链接关闭 ###") # 收到websocket连接建立的处理 def on_open(ws): def run(*args): d = {"common": ifly.CommonArgs, "business": ifly.BusinessArgs, "data": ifly.Data, d = json.dumps(d) print("------>开始发送文本数据") ws.send(d) if os.path.exists(f'{file_path}/{file_name}'): os.remove(f'{file_path}/{file_name}') thread.start_new_thread(run, ()) if __name__ == "__main__": # 测试时候在此处正确填写相关信息即可运行 ifly = Ifly(APPID='', APISecret='', APIKey='', Text="你好这是一个语音合成示例") websocket.enableTrace(False) wsUrl = ifly.create_url() print(wsUrl) ws = websocket.WebSocketApp(wsUrl, 
on_message=on_message,on_close=on_close) print(ws) ws.on_open = on_open ws.run_forever(sslopt={"cert_reqs": ssl.CERT_NONE})

这里通过on_open方法将参数数据传入到服务端,基本参数默认值设置了语音输出格式为mp3,朗读者是xiaoyan,也就是讯飞小燕,语速为80(默认为50),语速快一点显得没有那么呆板。随后讯飞服务端会通过on_message回调将转换好的音频流传回客户端:

```python
# 收到websocket消息的处理
def on_message(ws, message):
    try:
        message = json.loads(message)
        code = message["code"]
        sid = message["sid"]
        audio = base64.b64decode(message["data"]["audio"])
        status = message["data"]["status"]
        print(code)
        if status == 2:
            print("ws is closed")
            ws.close()
        if code != 0:
            errMsg = message["message"]
            print("sid:%s call error:%s code is:%s" % (sid, errMsg, code))
        else:
            # 将音频流追加写入指定目录的mp3文件
            with open(f'{file_path}/{file_name}', 'ab') as f:
                f.write(audio)
    except Exception as e:
        print("receive msg,but parse exception:", e)
```

注意返回值为Json格式的字符串,语音流放在data的audio字段中,随后写入到指定目录的mp3文件即可,程序返回样例:

xunfei-ttp /opt/homebrew/bin/python3.10 "/Users/liuyue/wodfan/work/xunfei-ttp/iflytek-tts.py"
wss://tts-api.xfyun.cn/v2/tts?authorization=YXBpX2tleT0iZWNkOTY1MWU1NjA1NjMxNDAyYzAzOGYwY2RkY2JkNDIiLCBhbGdvcml0aG09ImhtYWMtc2hhMjU2IiwgaGVhZGVycz0iaG9zdCBkYXRlIHJlcXVlc3QtbGluZSIsIHNpZ25hdHVyZT0ibXJwZmVrTE9nMFcrbjd4Q2hjYWJCMG14ZmxRRTBnbXJSNzdhUS9HWGp3OD0i&date=Tue%2C+07+Feb+2023+09%3A19%3A26+GMT&host=ws-api.xfyun.cn
<websocket._app.WebSocketApp object at 0x104d47af0>
------>开始发送文本数据
{'code': 0, 'message': 'success', 'sid': 'tts000e2154@hu1862b2c44cb05e0902', 'data': {'audio':
'//NoxAAAAAAAAAAAAAAAAAAA...(base64音频数据,篇幅所限,此处省略)', 'status': 1, 'ced': '36'}}
{'code': 0, 'message': 'success', 'sid': 'tts000e2154@hu1862b2c44cb05e0902', 'data': {'audio': 'TEFNRTMuMTAw...(base64音频数据,此处省略)', …
NeY9Z1BYNZjQW6Jn3tGcnTzEdGvFUi1mLAjwWVrjwYMCNiumB5vv48OPFMire1Ra2x6z3xNm+rU8sfPvDaolceNq9oe4ua7r/Aie+Pf7pf9/FjfUzPC7Im7+9ff0vXdszf0ia3bN//rfsrJnhyIyqfjudLzY+NXrulfeuPmDndq5p86vu+fSm7w/84/+tUZIlGeTLLHgSZ2/n0mZVaH33vVEN74Z0qwrUro7l7MZfrX/rC3///zaMQZM8QWyvfDeAC2t4pvPvf//7/3n///Hzn/GPreNVpiW0GsK00DE19a+9Yrr5pTM+77zHxSHFbIcdxcqaxE01ZtbX9swaRszZcG99CpWeTdWOBqMwOC4TFHm2uW6oVcOPDLEEnZ04fpIDIVS5Og80sq3ypREr9TxlYqmSkV0zKFgYmlzuqL4zGi3VFI0yENylZlIyLTi8gKl89nhq+KuoERdq6dRj8MhbJQaynN9TqViOteitbNpcLDGwOSGKdmcWd5iJd2faRotOzGuw8HmM6GpQtsdP/zaMQVIGuy7xxhhN75u6emDRdsoQlpEdNSZEKjTNVoxBynJSvah2t2QNInmpWVkLzs/t37Oz730Ds0PXVl3Jeur9DIJ0Q0cMx0T2/kaqZ19WtKwNSqZEKCC0dqNcPIPJeRr98Trp0Vt10RbrdNlb93IyhzAWUm/eld2ju/21raQbGoedruSaRuUCnHmcjIyqt8+xMxhYSESxiOFZlnagtVtROE5YvSSten7B3tNtpBK0L9ejRfPtbOXrK6bXYaFIiTVdsP/+6v+zsbxjDqqBs/qU73Myq9bv/zaMReI3Nm4lx6Rt+zbOVyUjrlyDjCyfM0Iz1cSR7f/e6/85c9QeTevAeW8G3K3ed+/yeJDxPKvvP6x22Y3Me5qJ+QAL5cpPwdxLQGD4xGWzYFANQ6EycUjK4sJiMsVxVsIOF7h0FmIw+y6Ryu3DcXpZpSguwnW1BEkw+a4XAsAgrVfNtulGC5S6zMKp0VZwaqM3087jO053OLRX/+307olbFdmr9jO+UspkFGMbWhukvaYyhRSwICckzTGNavQz5TGf5qPmoGKGu74iHqTEFNRTMuMTAwqv/zaMSbJBtmyxziRQiqqqqqqqqqqqqqqqpmZEi4uSAgF/83nQ1HAGQcGmBoSiYeIGEgDuxcKBBiAgFoM1V6IG8xcEDgJ+6dY7WG5tq/wkYasQIMo4ECcnXFXEkoKWuxwzKNShL/+hpVPFDgzCAYwNpgql+T+/cquJ4VmVyt6L9EKpeqsyP6aGfVDOolZDiWiXdqeKnwSk4+uVS4BJ7GMMI4rZTVTEFNRTMuMTAwVVVVVVVVVVVVVVVVgAACanAFJ3CV2viZAUbS5rAsojDDAaCgQOFw1xkamf/zaMS/IorSuv7YxPDMOzMUWGfotrYvJI0jq4dXWLjEmq1s6cEoSvRe+ytUwUgq8dR9dIULWNS8k1l+hZ/uBUY0NIqgNg9rrVL6hGmS3dBxAxjUFRgehUPRFYORFIELuYZTUiZxa7eXUVGjXdyTjIiouZeoohkVVGh6tUkIjvzXVQ7z3c0jvLWvL0kQgoQkaIiVS9Uv9Chh/CGINn7O4nyjcztaTEFNRTMuMTAwqqqqqqqqqqrAASq2Egt2qAZDKhSOBKxJboS3JAinQIUgYYqKkZ8KDloKe//zaMTqLWv6llbLENGmI6iSIZoMbn42GCVv1jJI5lPYkZL3K3FG+Hu00e1X89+1OUKr7ckCZ9tRcsKC4nSylEGTmjbRmLpd6w2G0m0ZGUFE3yuCmbsdtfzr3G+zLde0p9jJtFDakkgQMvh5ChzbLCtwz5t05XxnG4XdzgnuNGxUgJ22FRQkJCE2s97Ecq3XG7RIVUKsJVLZf9VWPbJ70PgTx3wKPIFz6WVREwk5MpkhI6aa18hCjPFY5riO/WjGQR53EayzVb3/vtsBQNLkMFEYlciPRJipKf/zaMTuLmPOnb42Eg3JOTbVi0/FVBUtL
tKyTckxrnKzpOEV3d+bJF0nxIkUkR1CQBpoDgNNIQqtDRTStp8WloFqM8fBLxaFYPUdhcyCj5NsFUPMIMrSBG+YaXNCHMhb1WzwYjVFq4ZrubMHEV5XUkf1vW7VAarutHYqnM/1Q7Z48Xvd2vuDd1uf78K2M6vSfxd2paT+uq/xriz7NV93/rt3hMUKRf9ApPdeukxBTUUzLjEwMKqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqv/zaMT/NGNOfDDaXt2qqqqqqqqqqqqqqqqqqqqqqqqqqqqqqoCCnrv/qnLr6U6QVAygQIwOEYpNoxB5E7f8h1T5NmacJRKt3wJNgBSyS3NVTTJ1omxFpNvOre95nWaumpjIaxIkX6ug4VFUdqsq1U6loxRhqK3/ZH7IjSslBYofAUcW8oi9W2QpnM9fbm6lsYWeErBigmsqo8SXnXyVAUOiWKqRTEFNRVVVIakMqpbILEJAPGNjpjRCZCQAY0MvbTZDczuEBzEYacDRUhQFwJ9nbajFHFIPG//zaMS0H/sy0l5Jip59vESRO1uxam0izVrDulr+HzdVS5ye76jD12Z57nbVMcS9ZKPLItKi6ToanDDC5M5pAfIzJwwVk3yExhsjyOkkRJ0wLEnSZDigl9VME6Zoyh8kasKd77stpGbtyq7cqrSuO1K8Zy+qguyCZCFKqRmtI9rWL9uOc5pvOt517bw++Pn6vX3ircDQo+j1aSrmkhh4VEUkRSLKTEFNRTMuMTAwqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqv/zaMT5MRsyYFDZntyqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqsGlmu2rlt0IVWU1RSSqhUUhkUoUOSCpLlPN+hgIxkMZW/Qz/MZ8xn/LlL6tR1/7yl1+gEKKWGFPleBOFEt///9kebmNRwowGhLOrcVIhsRRKqSEQd8SlTpH+WflnzvLVRKMqTEFNRTMuMTAwqqqqqqqqqqqqqqqqqqqqqiRZBILnfAwIqBJcCCCDoWMee2T2t1KeTRaGYv/zaMSXGJKGnb4KRBZR2AYKg2AJXSU9i2dn//LFgMEDBgqlllsqf///2SzIyl862zjMKFBhQYTCukikjeME+EmkjtJkMkToLUJyNAmZdDPHsNkU4MYQUoC/oYo1Yxrhea4EOmZJZG5rkh0vTOMZxvX/////9a4zTeo0GFClgXiT1tnP1asKDJLLJLQMqrhiQYlTDVBq6ZXbKyoiX///3/+FVVVVTEFNRTMuMTAwVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVf/zaMTmLHsFXZzA3vVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVQ==', 'status': 2, 'ced': '36'}} ws is closed ### 链接关闭 ###转换后文件:➜ xunfei-ttp ls demo.mp3 iflytek-tts.py requirements.txt除了讯飞小燕,免费版也可以选择其他朗读者:讯飞小燕 xiaoyan aisjiuxu aisxping aisjinger 讯飞许小宝 aisbabyxu个性化定制上,免费版有一定的限制,这一点不如微软Azure。结语仅就免费版本体现出的产品力来看,讯飞平台较微软Azure还是略逊一筹,但其基于Websocket的流式接口架构,确是颇有足以借镜之处,随着国内AI 
技术的不断发展,差距在逐渐缩小。最后,奉上完整项目地址,与众亲同飨:https://github.com/zcxey2911/xunfei-tts

以寡治众各个击破,超大文件分片上传之构建基于Vue.js3.0+Ant-design+Tornado6纯异步IO高效写入服务

分治算法是一种很古老但很务实的方法。本意即使将一个较大的整体打碎分成小的局部,这样每个小的局部都不足以对抗大的整体。战国时期,秦国破坏合纵的连横即是一种分而治之的手段;十九世纪,比利时殖民者占领卢旺达, 将卢旺达的种族分为胡图族与图西族,以图进行分裂控制,莫不如是。21世纪,人们往往会在Leetcode平台上刷分治算法题,但事实上,从工业角度上来看,算法如果不和实际业务场景相结合,算法就永远是虚无缥缈的存在,它只会出现在开发者的某一次不经意的面试中,而真实的算法,并不是虚空的,它应该能帮助我们解决实际问题,是的,它应该落地成为实体。大文件分片上传就是这样一个契合分治算法的场景,现而今,视频文件的体积越来越大,高清视频体积大概2-4g不等,但4K视频的分辨率是标准高清的四倍,需要四倍的存储空间——只需两到三分钟的未压缩4K 电影,或者电影预告片的长度,就可以达到500GB。 8K视频文件更是大得难以想象,而现在12K正在出现,如此巨大的文件,该怎样设计一套合理的数据传输方案?这里我们以前后端分离项目为例,前端使用Vue.js3.0配合ui库Ant-desgin,后端采用并发异步框架Tornado实现大文件的分片无阻塞传输与异步IO写入服务。前端分片首先,安装Vue3.0以上版本:npm install -g @vue/cli安装异步请求库axios:npm install axios --save随后,安装Ant-desgin:npm i --save ant-design-vue@next -SAnt-desgin虽然因为曾经的圣诞节“彩蛋门”事件而声名狼藉,但客观地说,它依然是业界不可多得的优秀UI框架之一。接着在项目程序入口文件引入使用:import { createApp } from 'vue' import App from './App.vue' import { router } from './router/index' import axios from 'axios' import qs from 'qs' import Antd from 'ant-design-vue'; import 'ant-design-vue/dist/antd.css'; const app = createApp(App) app.config.globalProperties.axios = axios; app.config.globalProperties.upload_dir = "https://localhost/static/"; app.config.globalProperties.weburl = "http://localhost:8000"; app.use(router); app.use(Antd); app.mount('#app')随后,参照Ant-desgin官方文档:https://antdv.com/components/overview-cn 构建上传控件:<a-upload @change="fileupload" :before-upload="beforeUpload" <a-button> <upload-outlined></upload-outlined> </a-button> </a-upload>注意这里需要将绑定的before-upload强制返回false,设置为手动上传:beforeUpload:function(file){ return false; }接着声明分片方法:fileupload:function(file){ var size = file.file.size;//总大小 var shardSize = 200 * 1024; //分片大小 this.shardCount = Math.ceil(size / shardSize); //总片数 console.log(this.shardCount); for (var i = 0; i < this.shardCount; ++i) { //计算每一片的起始与结束位置 var start = i * shardSize; var end = Math.min(size, start + shardSize); var tinyfile = file.file.slice(start, end); let data = new FormData(); data.append('file', tinyfile); data.append('count',i); data.append('filename',file.file.name); const axiosInstance = 
this.axios.create({withCredentials: false}); axiosInstance({ method: 'POST', url:'http://localhost:8000/upload/', //上传地址 data:data }).then(data =>{ this.finished += 1; console.log(this.finished); if(this.finished == this.shardCount){ this.mergeupload(file.file.name); }).catch(function(err) { //上传失败 }具体分片逻辑是,大文件总体积按照单片体积的大小做除法并向上取整,获取到文件的分片个数,这里为了测试方便,将单片体积设置为200kb,可以随时做修改。随后,分片过程中使用Math.min方法计算每一片的起始和结束位置,再通过slice方法进行切片操作,最后将分片的下标、文件名、以及分片本体异步发送到后台。当所有的分片请求都发送完毕后,封装分片合并方法,请求后端发起合并分片操作:mergeupload:function(filename){ this.myaxios(this.weburl+"/upload/","put",{"filename":filename}).then(data =>{ console.log(data); }至此,前端分片逻辑就完成了。后端异步IO写入为了避免同步写入引起的阻塞,安装aiofiles库:pip3 install aiofilesaiofiles用于处理asyncio应用程序中的本地磁盘文件,配合Tornado的异步非阻塞机制,可以有效的提升文件写入效率:import aiofiles # 分片上传 class SliceUploadHandler(BaseHandler): async def post(self): file = self.request.files["file"][0] filename = self.get_argument("filename") count = self.get_argument("count") filename = '%s_%s' % (filename,count) # 构成该分片唯一标识符 contents = file['body'] #异步读取文件 async with aiofiles.open('./static/uploads/%s' % filename, "wb") as f: await f.write(contents) return {"filename": file.filename,"errcode":0}这里后端获取到分片实体、文件名、以及分片标识后,将分片文件以文件名\_分片标识的格式异步写入到系统目录中,以一张378kb大小的png图片为例,分片文件应该顺序为200kb和178kb,如图所示:当分片文件都写入成功后,触发分片合并接口:import aiofiles # 分片上传 class SliceUploadHandler(BaseHandler): async def post(self): file = self.request.files["file"][0] filename = self.get_argument("filename") count = self.get_argument("count") filename = '%s_%s' % (filename,count) # 构成该分片唯一标识符 contents = file['body'] #异步读取文件 async with aiofiles.open('./static/uploads/%s' % filename, "wb") as f: await f.write(contents) return {"filename": file.filename,"errcode":0} async def put(self): filename = self.get_argument("filename") chunk = 0 async with aiofiles.open('./static/uploads/%s' % filename,'ab') as target_file: while True: source_file = open('./static/uploads/%s_%s' % (filename,chunk), 'rb') await target_file.write(source_file.read()) 
source_file.close() except Exception as e: print(str(e)) break chunk = chunk + 1 self.finish({"msg":"ok","errcode":0})这里通过文件名进行寻址,随后遍历合并,注意句柄写入模式为增量字节码写入,否则会逐层将分片文件覆盖,同时也兼具了断点续写的功能。有些逻辑会将分片个数传入后端,让后端判断分片合并个数,其实并不需要,因为如果寻址失败,会自动抛出异常并且跳出循环,从而节约了一个参数的带宽占用。轮询服务在真实的超大文件传输场景中,由于网络或者其他因素,很可能导致分片任务中断,此时就需要通过降级快速响应,返回托底数据,避免用户的长时间等待,这里我们使用基于Tornado的Apscheduler库来调度分片任务:pip install apscheduler随后编写job.py轮询服务文件:from datetime import datetime from tornado.ioloop import IOLoop, PeriodicCallback from tornado.web import RequestHandler, Application from apscheduler.schedulers.tornado import TornadoScheduler scheduler = None job_ids = [] # 初始化 def init_scheduler(): global scheduler scheduler = TornadoScheduler() scheduler.start() print('[Scheduler Init]APScheduler has been started') # 要执行的定时任务在这里 def task1(options): print('{} [APScheduler][Task]-{}'.format(datetime.now().strftime('%Y-%m-%d %H:%M:%S.%f'), options)) class MainHandler(RequestHandler): def get(self): self.write('<a href="/scheduler?job_id=1&action=add">add job</a><br><a href="/scheduler?job_id=1&action=remove">remove job</a>') class SchedulerHandler(RequestHandler): def get(self): global job_ids job_id = self.get_query_argument('job_id', None) action = self.get_query_argument('action', None) if job_id: # add if 'add' == action: if job_id not in job_ids: job_ids.append(job_id) scheduler.add_job(task1, 'interval', seconds=3, id=job_id, args=(job_id,)) self.write('[TASK ADDED] - {}'.format(job_id)) else: self.write('[TASK EXISTS] - {}'.format(job_id)) # remove elif 'remove' == action: if job_id in job_ids: scheduler.remove_job(job_id) job_ids.remove(job_id) self.write('[TASK REMOVED] - {}'.format(job_id)) else: self.write('[TASK NOT FOUND] - {}'.format(job_id)) else: self.write('[INVALID PARAMS] INVALID job_id or action') if __name__ == "__main__": routes = [ (r"/", MainHandler), (r"/scheduler/?", SchedulerHandler), init_scheduler() app = Application(routes, debug=True) app.listen(8888) 
IOLoop.current().start()每一次分片接口被调用后,就建立定时任务对分片文件进行监测,如果分片成功就删除分片文件,同时删除任务,否则就启用降级预案。结语分治法对超大文件进行分片切割,同时并发异步发送,可以提高传输效率,降低传输时间,和之前的一篇:聚是一团火散作满天星,前端Vue.js+elementUI结合后端FastAPI实现大文件分片上传,逻辑上有异曲同工之妙,但手法上却略有不同,确是颇有相互借镜之处,最后代码开源于Github:https://github.com/zcxey2911/Tornado6\_Vuejs3\_Edu,与众亲同飨。
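上述前端切片与后端合并的分治流程,可以用一段纯Python脚本在本地完整模拟一遍做自测。分片体积沿用文中的200KB,文件名与临时目录均为演示假设,与Tornado服务本身没有直接依赖:

```python
import math
import os
import tempfile

SHARD_SIZE = 200 * 1024  # 与前端保持一致的单片体积:200KB


def split_file(data, shard_size=SHARD_SIZE):
    """模拟前端slice逻辑:总体积除以单片体积并向上取整,得到总片数"""
    count = math.ceil(len(data) / shard_size)
    return [data[i * shard_size:(i + 1) * shard_size] for i in range(count)]


def merge_chunks(workdir, filename, chunks):
    """模拟后端合并逻辑:按 文件名_下标 寻址,增量追加写入,寻址失败即跳出循环"""
    for i, chunk in enumerate(chunks):
        with open(os.path.join(workdir, "%s_%s" % (filename, i)), "wb") as f:
            f.write(chunk)
    target = os.path.join(workdir, filename)
    i = 0
    with open(target, "ab") as out:
        while True:
            try:
                with open(os.path.join(workdir, "%s_%s" % (filename, i)), "rb") as src:
                    out.write(src.read())
            except FileNotFoundError:
                break
            i += 1
    with open(target, "rb") as f:
        return f.read()


if __name__ == "__main__":
    data = os.urandom(378 * 1024)  # 对应文中378kb图片的例子,应切成200kb+178kb两片
    chunks = split_file(data)
    with tempfile.TemporaryDirectory() as d:
        print(len(chunks), merge_chunks(d, "demo.png", chunks) == data)  # 2 True
```

可以看到合并端确实不需要知道分片个数,按下标顺序寻址、读不到文件即视为合并完成,与正文中节约一个参数带宽的设计一致。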

天人合一物我相融,站点升级渐进式Web应用PWA(Progressive Web Apps)实践

PWA(Progressive web apps,渐进式 Web 应用)使用现代的 Web API 以及传统的渐进式增强策略来创建跨平台 Web 应用程序,说白了,PWA可以让我们的站点以原生APP的形式运行,但相比于安装原生APP应用,访问PWA显然更加容易和迅速,还可以通过链接来分享PWA应用。有许多知名的网络平台已经将 PWA 方案落地,比如Twitter。选择增强的网站体验而不是原生应用。事实上使用PWA也确实从中获得了显而易见的益处。https://www.pwastats.com 这个网站上分享了许多案例研究,PWA相比于传统应用有以下好处:1、减少应用安装后的加载时间,通过 Service Workers 来进行缓存,以此来节省带宽和时间。2、当应用有可用的更新时,可以只更新发生改变的那部分内容。相比之下,对于一个原生应用而言,即便是最微小的改动也需要强制用户去进行热更新或者再次下载整个应用。 3、外观和使用感受与原生平台更加融为一体——应用图标被放置在主屏幕上,应用可以全屏运行等。 凭借系统通知和推送消息与用户保持连接,对用户产生更多的吸引力,并且提高转换效率。诚然,从零开始研发PWA应用会有一定的成本,但如果我们本身就拥有基于Web的站点,那么就可以通过增加对应的配置文件和服务进行升级操作,直接拥有PWA应用。HTTPS服务首先PWA要求站点的请求方式为HTTPS,如果是生产环境,可以通过为Nginx服务器配置SSL的方式进行适配,但是线下环境测试PWA时就有点费劲了,所以通过openssl工具为本地域名localhost做自签证书:openssl req -x509 -out localhost.crt -keyout localhost.key \ -newkey rsa:2048 -nodes -sha256 \ -days 3650 \ -subj '/CN=localhost' -extensions EXT -config <( \ printf "[dn]\nCN=localhost\n[req]\ndistinguished_name = dn\n[EXT]\nsubjectAltName=DNS:localhost\nkeyUsage=digitalSignature\nextendedKeyUsage=serverAuth")产出:localhost.crt和localhost.key文件,key是私用密钥openssl格式,通常是rsa算法。csr是证书请求文件,用于申请证书,在制作csr文件的时,必须使用自己的私钥来签署申,还可以设定一个密钥。将文件放到项目的根目录下,随后在构建项目服务的时候配置即可,以Tornado为例:server = httpserver.HTTPServer(app,xheaders=True,ssl_options={ "certfile": "./localhost.crt", "keyfile": "./localhost.key", # 指定端口 server.listen(443)这里通过设置ssl\_options参数来导入私钥和证书,同时将端口改为HTTPS默认端口号443。如此,在本地也可以对PWA进行测试了,当然了,如果不需要本地操作,也可以跳过这步。manifest.json配置文件为了实现 PWA 应用添加至桌面的功能,除了要求站点支持 HTTPS 之外,还需要准备 manifest.json 文件去配置应用的图标、名称等信息。以本站为例,在站点根目录创建manifest.json文件:{ "name": "刘悦的技术博客", "short_name": "刘悦的技术博客", "description": "刘悦的技术博客", "icons": [ "src": "https://v3u.cn/v3u/Public/images/pwa192.png", "sizes": "192x192", "type": "image/png" "src": "https://v3u.cn/v3u/Public/images/pwa512.png", "sizes": "512x512", "type": "image/png" "background_color": "#FFF", "theme_color": "#FFF", "display": "standalone", "orientation": "portrait", "start_url": "/", "scope": "/" }由上至下,依次是 PWA 应用的名称、描述、图标文件、banner颜色、显示方式、开始页面的链接和 PWA 
的作用域。为此我们需要提供两张不同分辨率的站点图标文件:ServiceWorker服务Service Worker是一个注册在指定源和路径下的事件驱动型Web Worker。它充当了Web应用程序与浏览器之间的代理服务器,进行资源在文件级别下的缓存与操控,拦截页面请求,实现在不同的情况下对不同请求的响应策略。Service Worker本质上就是一个Web Worker,因此它具有Web Worker的特点:无法操作DOM、脱离主线程、独立上下文。Service Worker还具有这些特点:只能在Https下使用、运行在浏览器后台,不受页面刷新影响、更强大的离线缓存能力(使用Cache API)、请求拦截能力、完全异步,不能使用同步API、持续运行,第一次访问页面后,Service Worker就会安装激活并持续运行,直到手动销毁。以本站为例,在站点根目录创建sw.js文件,注意Service Worker文件位置一定得在根目录,如果不在根目录也要通过重写或者url映射让其可以通过根目录路径进行访问,如:https://v3u.cn/sw.js,否则浏览器会检测不到Service Worker服务:var CACHE_NAME = 'v3u-cache-v1'; var urlsToCache = [ '/v3u/Public/css/tidy_min.css' self.addEventListener('install', function (event) { event.waitUntil( caches.open(CACHE_NAME).then(function (cache) { console.log('Open cache'); return cache.addAll(urlsToCache); }).then(function () { self.skipWaiting(); }); 当我们为页面注册Service Worker后,Service Worker开始进行安装,安装成功之后,会在worker中触发install事件;如果安装失败,则进入废弃状态。如果Service Worker逻辑文件更新(相关资源文件变动或者内部逻辑更新等),Service Worker会重新安装,如果这个时候,页面依然存在激活状态下的worker(旧的Service Worker),那么新的worker会进入waiting状态进行等待,直到我们主动去操作worker强制其更新,或者等待用户关闭所有页面,这个时候新的worker才会进入到激活状态。在install事件中,我们使用caches.open方法打开cache对象,并通过cache.addAll缓存所有我们列出的文件。如果Service Worker存在更新,我们使用skipWaiting跳过等待,直接强制新的worker进入激活状态。随后,添加fetch事件:self.addEventListener('fetch', function(event){ if(event.request.method !== 'GET') return; event.respondWith( caches.match(event.request).then(function(response){ if(response){ console.log('return caches'); return response; }else{ return fetch(event.request).catch(function(){ if(/\.html$/.test(event.request.url)) return caches.match('/html/neterror.html'); });这里只监听了全站的GET请求方式,即我们只希望控制资源请求。通过caches.match检查请求是否命中了缓存,如果命中,则直接返回缓存给用户,防止重复请求,节约资源。如果没有命中,则将使用fetch方法请求网络资源并返回给用户。当网络状态异常时(fetch().catch()),返回404页面的缓存给用户,告知用户当前处于无网络状态,不能访问相关页面。指定了一些页面和文件进行缓存,我们希望用户在无网络的情况下只能访问到我们指定缓存的页面。当然,还有另外一种情况,我们指定了一些页面进行缓存(常用页面),当用户访问到一些不常用页面时,再对其进行缓存。这样,我们可以对资源配置进行优化,不过多的占用用户本地资源去缓存所有页面,因为PWA的缓冲本身是存储到客户端的,对于非所有用户的常用页面,按需缓存:self.addEventListener('fetch', function(event){ 
if(event.request.method !== 'GET') return; event.respondWith( caches.match(event.request).then(function(response){ if(response){ console.log('return caches'); return response; }else{ return fetch(event.request).then(function(res){ var responseToCache = res.clone(); caches.open(CACHE_NAME).then(function(cache){ catch.put(event.request, responseToCache); return res; });至此,ServiceWorker服务文件就撰写完成了。生产环境上线配置:分别将manifest.json和sw.js文件分别上传到生产环境之后,在页面的head标签中进行声明:<link rel="manifest" href="manifest.json">声明后,注意访问一下是否正确返回:https://v3u.cn/manifest.json随后在页面中注册Service Worker服务:<script> if ('serviceWorker' in navigator) { window.addEventListener('load', () => navigator.serviceWorker.register("/sw.js?v0") .catch(() => {}) // ignore </script>这里首先判断当前浏览器的navigator是否支持serviceWorker,随后使用navigator.serviceWorker.register函数来注册Service Worker。其中,参数为要执行的worker逻辑文件路径,注意这个路径是基于origin的,而非当前文件。接着键入组合键,打开chrome浏览器的开发者工具:Mac系统上的“⌥+⌘+I”Win系统上的“F12+Ctrl+Shift+I”在Chrome 的应用标签下进行检查,看应用清单有没有读出你的 PWA 应用信息配置文件:随后在serviceWorker标签下检查serviceWorker是否正确运行:接着访问站点,在地址栏即可添加PWA应用:访问效果:结语渐进式增强和响应式设计已经可以让我们构建对移动端非常友好的站点,而PWA则又在我们的身后轻轻地推了一把,黄河之水源可滥觞,星星之火正在燎原,一年以内,我们都将感到PWA的灼人温度。
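manifest.json写好之后,上线前不妨用一小段Python脚本自查必填字段与图标尺寸是否齐全。字段清单依据上文的配置整理,仅作辅助示例,最终校验仍以Chrome开发者工具的应用清单面板为准:

```python
import json

# 依据上文manifest.json整理的关键字段
REQUIRED_FIELDS = ("name", "short_name", "icons", "start_url", "display")


def check_manifest(text):
    """返回缺失项列表,空列表代表通过"""
    manifest = json.loads(text)
    missing = [k for k in REQUIRED_FIELDS if k not in manifest]
    # 图标需要192x192与512x512两种分辨率
    sizes = {icon.get("sizes") for icon in manifest.get("icons", [])}
    missing += ["icons:" + s for s in ("192x192", "512x512") if s not in sizes]
    return missing


if __name__ == "__main__":
    demo = json.dumps({
        "name": "刘悦的技术博客",
        "short_name": "刘悦的技术博客",
        "icons": [
            {"src": "pwa192.png", "sizes": "192x192", "type": "image/png"},
            {"src": "pwa512.png", "sizes": "512x512", "type": "image/png"},
        ],
        "start_url": "/",
        "display": "standalone",
    })
    print(check_manifest(demo))  # []
```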

当我们进行性能优化,我们在优化什么(LightHouse优化实操)

好的互联网产品不仅仅在功能上要高人一筹,在性能层面也需要出类拔萃,否则金玉其外败絮其中,页面是美轮美奂了,结果首屏半天加载不出来,难免让用户乘兴而来,败兴而归。幸运的是,前端的性能优化有诸多有迹可循的理论和方法,其中相对权威的,无疑是LightHouse。LightHouse 是一个开源的自动化工具,它作为 Chrome 浏览器的扩展程序运行,提供一套完整的站点评分标准,我们可以依据此标准对站点进行基准测试,从而达到优化的效果。怎么打开LightHouse?可以在Chrome浏览器开发人员工具中找到LightHouse。要打开“开发人员工具”,请选择: “顶部菜单→查看→开发人员→开发人员工具” 或者使用快捷键:Mac系统上的“⌥+⌘+I” Win系统上的“F12+Ctrl+Shift+I”。 随后点击生成报告按钮即可:LightHouse评分大体上有四大指标,分别为:性能、无障碍、最佳做法以及SEO。性能指标(Performance)性能指标里又分为六个小指标:Largest Contentful Paint 【简称LCP: 最大内容渲染】 FCP最大内容渲染时间标记了渲染出最大文本或图片的时间。 Total Blocking Time 【简称TBT: 总阻塞时间】 TBT测量了FCP(首次内容渲染)和TTI(可交互时间)之间的总耗时。TTI可能会被主线程阻塞以至于无法及时响应用户。大于50ms的任务称为长任务,当任意长任务出现时,主线程则称为被阻塞状态。由于浏览器不会打断正在进行中的长任务,所以,如果用户在执行长任务时和页面有交互事件时,浏览器必须等到该长任务完成才能响应。TBT计算的是在FCP到TTI之间所有长任务时间内总和。 First Contentful Paint 【简称FCP: 首次内容渲染】FCP测量了从页面开始加载到页面任意部分的内容渲染到屏幕上。Speed Index 【简称SI: 速度指数】 SI速度指数表明了网页内容的可见填充速度。lighthouse首先捕获页面加载的视屏,然后对比帧与帧之间视觉效果变化(通过计算结构相似指数SSMI来比较)。 Time to Interactive 【简称TTI: 可交互时间】 可交互时间是指网页需要多长时间才能提供完整交互功能。TTI测量了从页面开始加载到页面的主要附属资源加载完毕,并且可以足够快速回应用户输入的所用时间。 Cumulative Layout Shift 【简称CLS: 累积布局偏移】 CLS累积布局偏移旨在测量可见元素在视口内的移动情况。CLS值越小越好。性能优化手段有哪些手段可以提高这些性能指标?首先需要优化的是页面“资源”,这里的资源指的是页面中加载的一切元素,包含但不限于:js文件、css文件、图片、视频等。对于js文件来说,首先要做的是业务分拆,不同页面只加载对应需要的文件,并且做到单页面只加载一个js文件,减少Http请求数,多余的文件要做合并压缩操作,但其实这里有一个基础问题,就是如果js文件本身就很庞大,压缩比例再高,也是杯水车薪,举个例子,一般情况下Jquery官方的压缩版就已经高达80kb左右了,这样的体积很难有再次压缩的优化空间,所以还不如直接摒弃Jquery,换成别的功能上可以替换的库,比如zepto,后者的体积只有26kb,是前者的四分之一。随后进行压缩合并操作,首先安装:npm install uglify-js -g以本站为例,业务上用到的js库分别为zepto.min.js、my.js、lazyload.min.js、wordcloud2.min.js iconfont.js,将这五个js文件进行合并压缩:uglifyjs zepto.min.js my.js lazyload.min.js wordcloud2.min.js iconfont.js -o ./1-min.js如此,最后得到一个体积为59kb的1-min.js文件,当然这是业务层面的压缩,还可以通过修改服务器进行gzip压缩:location ~ .*.(jpg|gif|png|bmp|js|css)$ { gzip on; gzip_http_version 1.1; gzip_comp_level 3; gzip_types text/plain application/json application/x-javascript application/css application/xml application/xml+rss text/javascript application/x-httpd-php image/jpeg image/gif image/png image/x-ms-bmp; }加载方式上,尽量使用预加载:<link 
rel="preload" as="script" href="1-min.js" />同时,对于一些站外js比如广告,或者一些js特效,我们可以对其进行延时加载的操作,即首屏加载好之后,再加载这些逻辑:<script nonce="EDNnf03nceIOfn39fn3e9h3sdfa"> (function() { var done = false; var script = document.createElement('script'); script.async = true; script.type = 'text/javascript'; script.src = '//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js'; var createScript = setTimeout( function(){ document.getElementsByTagName('HEAD').item(0).appendChild(script); WordCloud(canvas, options); }, 7000 script.onreadystatechange = script.onload = function(e) { if (!done && (!this.readyState || this.readyState == 'loaded' || this.readyState == 'complete')) { (adsbygoogle = window.adsbygoogle || []).push({}); })(); </script>上面的逻辑就是首屏完成7秒后再加载Google广告和标签云特效。对于css文件的处理,原理和js文件差不多,宗旨也是分拆,缩小体积,并且压缩:cssMinifier(['./bootstrap.min.css', '../js/kindeditor/plugins/code/prettify_dark.css', './style.css'], './tidy_min.css');优化后,得到体积为17kb的tidy\_min.css文件。对于图片文件,不仅是首图,所有图片最好都采用新的图片格式Webp,用以减少其体积,具体操作方法请移步:石火电光追风逐日|前端优化之次时代图片压缩格式WebP的项目级躬身实践(Python3 PIL+Nginx)。对于特定的图片,比如Logo,使用svg格式图片,请移步:Logo小变动,心境大不同,SVG矢量动画格式网站Logo图片制作与实践教程(Python3)同时,对于图片一律声明宽高属性,并且使用支持lazyload.js组件推迟对屏幕外图片的加载。使用viewport标签加快移动端的载入速度:<meta name="viewport" content="width=device-width, initial-scale=1"> <meta name="applicable-device" content="pc,mobile"/>无障碍(Accessibility)访问无障碍检测所有用户是否都能有效地访问内容并浏览网站,无障碍性的每个指标项测试结果为pass或者fail,与性能指标项的计算方式不同,当页面只是部分通过某项指标时,页面的这项指标将不会得分。例如,如果页面中的一些元素有屏幕阅读器友好的命名,而其他的元素没有,那么这个页面的 screenreader-friendly-names 指标项得分为0。一般情况下,优化无障碍其实是对于站点标签的优化,比如页面元素是否具备title标签、title元素是否按降序排列、是否声明了页面语言类型、元素是否具备alt标签等等,值得一提的是,页面对比度也是无障碍评分重要的一环,假如背景色是white,那么前景色最好选择高对比度的颜色,比如black。最佳做法(Best Practice)最佳做法检测可以改善网页的代码健康状况的一些最佳做法,评分的分值由相关指标的加权平均值计算而来。最佳做法指标我们可以理解为就是站点安全性的指标,多数情况下,需要保证协议为HTTPS,同时要开启CSP网页安全政策防止XSS攻击。CSP 的实质就是白名单制度,开发者明确告诉客户端,哪些外部资源可以加载和执行,等同于提供白名单。它的实现和执行全部由浏览器完成,开发者只需提供配置。CSP 大大增强了网页的安全性。攻击者即使发现了漏洞,也没法注入脚本,除非还控制了一台列入了白名单的可信主机。开启方法:<meta http-equiv="Content-Security-Policy" content="script-src 'self'; 
object-src 'none'; style-src cdn.example.org third-party.org; child-src https:">搜索引擎优化(SEO)搜索引擎优化检测搜索引擎对网页内容的理解程度是怎样的,评分的分值由相关指标的加权平均值计算而来。说白了,就是站点页面是否适合搜素引擎蜘蛛的抓取以及收录,以本站为例,搜索引擎需要的标签如下:<head> <meta http-equiv="Content-Type" content="text/html;charset=utf-8"> <meta http-equiv="X-UA-Compatible" content="IE=edge"> <meta name="viewport" content="width=device-width, initial-scale=1"> <meta name="applicable-device" content="pc,mobile"/> <title>当我们进行性能优化,我们在优化什么(LightHouse优化实操)-刘悦</title> <meta name="description" content="好的互联网产品不仅仅在功能上要高人一筹,在性能层面也需要出类拔萃,否则金玉其外败絮其中,页面是美轮美奂了,结果首屏半天加载不出来,难免让用户乘兴而来,败兴而归。幸运的是,前端的性能优化有诸多有迹可循的理论和方法,其中相对权威的,无疑是LightHouse。LightHouse是一个开源的自动化工具,它作为Chrome浏览器的扩展程序运行,提供一套完整的站点评分标准,我们可以依"> <meta content="刘悦" name="Author"> <link rel="canonical" href="https://v3u.cn/a_id_214"/> <link rel="miphtml" href="https://v3u.cn/mipa_id_214"> <link rel="stylesheet" href="/v3u/Public/css/tidy_min.css?v=11"/> <link rel="shortcut icon" href="favicon.ico" type="image/x-icon"/> <link rel="icon" href="favicon.ico" type="image/x-icon"/> <link rel="stylesheet" href="/v3u/Public/css/share.min.css?v=1"> </head>包括页面标题、描述、作者、页面唯一标识等等元素。当我们完成上面这些指标的优化之后,就可以,坐下来,欣赏这紫禁烟花一万重了:正是:东风夜放花千树,更吹落,星如雨。结语前端的性能分析和优化方式,无论是传统性能还是感官性能完全可以根据LightHouse按图索骥。过程中可以针对某些指标进行一定的取舍,虽然本站在LightHouse的优化实践中取得了一定的效果,但路漫漫其修远兮,吾将上下而求索。
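上文Nginx配置里的gzip_comp_level,可以借Python标准库的gzip模块直观感受一下:级别越高体积越小、CPU开销越大,3通常是体积与耗时之间的折中。示例文本与压缩结果仅作演示,实际收益取决于资源内容本身的重复度:

```python
import gzip

# 用一段重复度较高的"类JS"文本模拟静态资源
source = ("function mytest(){console.log('Hello LightHouse');}" * 200).encode()

for level in (1, 3, 9):  # 对应Nginx配置中gzip_comp_level的取值
    compressed = gzip.compress(source, compresslevel=level)
    print(level, len(source), "->", len(compressed))
```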

青山不遮,毕竟东流,集成Web3.0身份钱包MetaMask以太坊一键登录(Tornado6+Vue.js3)

上世纪九十年代,海湾战争的时候,一位美军军官担心他们的五角大楼会被敌人的一枚导弹干掉,从而导致在全球的美军基地处于瘫痪状态。这时候,有一位天才的科学家说,最好的中心就是没有中心。是的,这就是最朴素的去中心化思想,于是互联网出现了。一个没有互联网的时代是无法想象的,互联网的核心就是把一个信息分成若干的小件,用不同的途径传播出去,怎么方便怎么走。三十年后的今天,去中心化身份逐渐被广泛采用。用户的部分在线活动在链上是公开的,可通过加密钱包搜索到,用户在链上创造、贡献、赚取和拥有的东西,都反映了他们的喜好,也逐渐积累成该用户的身份和标识。当我们的用户厌倦了传统的电子邮件/密码注册流程时,他们会选择Google、GitHub等社交登录方式,这种方式虽然节约了用户的时间,但登录信息也会被第三方平台记录,也就是说我们用平台账号做了什么,平台都会一目了然,甚至还会对我们的行为进行分析、画像。那么有没有一种登录方式,它的所有信息都只保存在客户端和后端,并不牵扯三方平台授权,最大化的保证用户隐私呢?Web3.0给我们提供了一种选择:MetaMask。MetaMaskMetaMask是用于与以太坊区块链进行交互的软件加密货币钱包。MetaMask允许用户通过浏览器插件或移动应用程序访问其以太坊钱包,然后可以使用这些扩展程序与去中心化应用程序进行交互。当然了,首先需要拥有一个MetaMask钱包,进入https://chrome.google.com/webstore/detail/metamask/nkbihfbeogaeaoehlefnkodbefgpgknn安装metamask浏览器插件:随后点开插件,创建账号,记录密码、钱包地址、以及助记词等信息。安装好插件之后,我们就可以利用这个插件和网站应用做交互了。钱包登录流程登录逻辑和传统的三方登录还是有差异的,传统三方登录一般是首先跳转三方平台进行授权操作,随后三方平台将code验证码返回给登录平台,登录平台再使用code请求三方平台换取token,再通过token请求用户账号信息,而钱包登录则是先在前端通过Web3.js浏览器插件中保存的私钥对钱包地址进行签名操作,随后将签名和钱包地址发送到后端,后端利用Web3的库用同样的算法进行验签操作,如果验签通过,则将钱包信息存入token,并且返回给前端。前端签名操作首先需要下载前端的Web3.0操作库,https://docs.ethers.io/v4/,随后集成到登录页面中:<script src="{{ static_url("js/ethers-v4.min.js") }}"></script> <script src="{{ static_url("js/axios.js") }}"></script> <script src="{{ static_url("js/vue.js") }}"></script>这里我们基于Vue.js配合Axios使用。接着声明登录激活方法:sign_w3:function(){ that = this; ethereum.enable().then(function () { this.provider = new ethers.providers.Web3Provider(web3.currentProvider); this.provider.getNetwork().then(function (result) { if (result['chainId'] != 1) { console.log("Switch to Mainnet!") } else { // okay, confirmed we're on mainnet this.provider.listAccounts().then(function (result) { console.log(result); this.accountAddress = result[0]; // figure out the user's Eth address this.provider.getBalance(String(result[0])).then(function (balance) { var myBalance = (balance / ethers.constants.WeiPerEther).toFixed(4); console.log("Your Balance: " + myBalance); // get a signer object so we can do things that need signing this.signer = provider.getSigner(); var rightnow = 
(Date.now()/1000).toFixed(0) var sortanow = rightnow-(rightnow%600) this.signer.signMessage("Signing in to "+document.domain+" at "+sortanow, accountAddress, "test password!") .then((signature) => { that.handleAuth(accountAddress,signature); console.log(this.signer); },通过使用signMessage方法返回签名,这里加签过程中使用基于时间戳的随机数防止未签名,当前端签名生成好之后,立刻异步请求后台接口://检查验证 handleAuth:function(accountAddress, signature){ this.myaxios("/checkw3/","post",{"public_address":accountAddress,"signature":signature}).then(data =>{ if(data.errcode==0){ alert("欢迎:"+data.public_address); localStorage.setItem("token",data.token); localStorage.setItem("email",data.public_address); window.location.href = "/"; }else{ alert("验证失败"); }这里将当前账户的钱包地址和签名传递给后端,如图所示:完整页面代码:<!DOCTYPE html> <html lang="en"> <head> <meta charset="utf-8"> <title>Edu</title> <meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no, viewport-fit=cover"> <link rel="stylesheet" href="{{ static_url("css/min.css") }}" > <link rel="icon" href="/static/img/favicon.68cbf4197b0c.png"> <script src="{{ static_url("js/ethers-v4.min.js") }}"></script> <script src="{{ static_url("js/axios.js") }}"></script> <script src="{{ static_url("js/vue.js") }}"></script> </head> <body> <div> {% include "head.html" %} <div id="app" class="container main-content"> <div class="row justify-content-center"> <div class="col-md-10 col-lg-8 article"> <div class="article-body page-body mx-auto" style="max-width: 400px;"> <h1 class="text-center mb-4">Sign-in</h1> <div class="socialaccount_ballot"> <div class="text-center mb-3"> <ul class="list-unstyled"> <li> <a @click="sign_w3()" title="GitHub" class="socialaccount_provider github btn btn-secondary btn-lg w-100" href="JavaScript:void(0)">Connect With <strong>Meta Mask</strong></a> </li> <li> <a title="GitHub" class="socialaccount_provider github btn btn-secondary btn-lg w-100" 
href="https://github.com/login/oauth/authorize?client_id=249b69d8f6e63efb2590&redirect_uri=http://localhost:8000/github_back/">Connect With <strong>GitHub</strong></a> </li> </ul> </div> <div class="text-center text-muted my-3">— or —</div> </div> <div class="form-group"> <div id="div_id_login" class="form-group"> <label for="id_login" class=" requiredField"> Email<span class="asteriskField">*</span> </label> <div class=""> <input type="email" v-model="email" placeholder="" autocomplete="email" autofocus="autofocus" class="textinput textInput form-control" > </div> </div> </div> <div class="form-group"> <div id="div_id_password" class="form-group"> <label for="id_password" class=" requiredField"> Password<span class="asteriskField">*</span> </label> <div class=""> <input type="password" v-model="password" placeholder="" autocomplete="current-password" minlength="8" maxlength="99" class="textinput textInput form-control" > </div> </div> </div> <div class="text-center"> <button class="btn btn-primary btn-lg text-wrap px-5 mt-2 w-100" name="jsSubmitButton" @click="sign_on">Sign-In</button> </div> </div> </div> </div> </div> {% include "foot.html" %} </div> <script> const App = { data() { return { email:"", password:"", provider:null, accountAddress:"", signer:null created: function() { methods: { //metamask登录 sign_w3:function(){ that = this; ethereum.enable().then(function () { this.provider = new ethers.providers.Web3Provider(web3.currentProvider); this.provider.getNetwork().then(function (result) { if (result['chainId'] != 1) { console.log("Switch to Mainnet!") } else { // okay, confirmed we're on mainnet this.provider.listAccounts().then(function (result) { console.log(result); this.accountAddress = result[0]; // figure out the user's Eth address this.provider.getBalance(String(result[0])).then(function (balance) { var myBalance = (balance / ethers.constants.WeiPerEther).toFixed(4); console.log("Your Balance: " + myBalance); // get a signer object so we can do 
things that need signing this.signer = provider.getSigner(); var rightnow = (Date.now()/1000).toFixed(0) var sortanow = rightnow-(rightnow%600) this.signer.signMessage("Signing in to "+document.domain+" at "+sortanow, accountAddress, "test password!") .then((signature) => { that.handleAuth(accountAddress,signature); console.log(this.signer); //检查验证 handleAuth:function(accountAddress, signature){ this.myaxios("/checkw3/","post",{"public_address":accountAddress,"signature":signature}).then(data =>{ if(data.errcode==0){ alert("欢迎:"+data.public_address); localStorage.setItem("token",data.token); localStorage.setItem("email",data.public_address); window.location.href = "/"; }else{ alert("验证失败"); sign_on:function(){ if(this.email == ""){ alert("邮箱不能为空"); return false; if(this.password == ""){ alert("密码不能为空"); return false; this.myaxios("/user_signon/","get",{"email":this.email,"password":this.password}).then(data =>{ if(data.errcode != 0){ alert(data.msg); }else{ alert(data.msg); localStorage.setItem("token",data.token); localStorage.setItem("email",data.email); window.location.href = "/"; //localStorage.removeItem("token") const app = Vue.createApp(App); app.config.globalProperties.myaxios = myaxios; app.config.globalProperties.axios = axios; app.config.compilerOptions.delimiters = ['${', '}'] app.mount("#app"); </script> </body> </html>Tornado后端验签:有人说,既然钱包私钥是存储在浏览器中,也就是保存在客户端,那签名已经通过私钥生成了,为什么还要过一遍后端呢?这不是多此一举吗?事实上,攻击者完全可能获取到前端生成的所有信息,所以签名一定必须得是后端提供,或者至少有一步后端验证,比如著名的微信小程序获取openid问题。后端我们使用异步框架Tornado,配合web3库进行调用,首先安装依赖:pip3 install tornado==6.1 pip3 install web3==5.29.1随后创建异步视图方法: from tornado.web import url import tornado.web from tornado import httpclient from .base import BaseHandler from web3.auto import w3 from eth_account.messages import defunct_hash_message import time class CheckW3(BaseHandler): async def post(self): public_address = self.get_argument("public_address") signature = self.get_argument("signature") domain = self.request.host if ":" in domain: domain = 
domain[0:domain.index(":")] now = int(time.time()) sortanow = now-now%600 original_message = 'Signing in to {} at {}'.format(domain,sortanow) print("[+] checking: "+original_message) message_hash = defunct_hash_message(text=original_message) signer = w3.eth.account.recoverHash(message_hash, signature=signature) if signer == public_address: user = await self.application.objects.get(User,email=public_address) except Exception as e: user = await self.application.objects.create(User,email=public_address,password=create_password("third"),role=1) myjwt = MyJwt() token = myjwt.encode({"id":user.id}) self.finish({"msg":"ok","errcode":0,"public_address":public_address,"token":token}) else: self.finish({"msg":"could not authenticate signature","errcode":1}) 这里通过recoverHash方法对签名进行反编译操作,如果反编译后的钱包地址和前端传过来的钱包地址吻合,那么说明当前账户的身份验证通过:当验签通过之后,利用钱包地址在后台创建账号,随后将钱包地址、token等信息返回给前端,前端将其保存在stroage中即可。结语没错,将至已至,未来已来,是时候将Web3.0区块链技术融入产品了,虽然有些固有的思维方式依然在人们的脑海挥之不去,但世界却在时不我待地变化着,正是:青山遮不住,毕竟东流去!项目开源在https://github.com/zcxey2911/Tornado6\_Vuejs3\_Edu ,与君共觞。
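前后端能各自独立构造出同一条待签名消息,关键就在那行基于600秒窗口向下取整的时间戳,单独拆出来看会更清楚。域名与时间戳均为演示假设,真实验签仍需配合web3的recoverHash:

```python
import time


def login_message(domain, now=None):
    """复刻文中前后端共用的待签名消息:时间戳向下取整到600秒窗口"""
    rightnow = int(time.time()) if now is None else now
    sortanow = rightnow - rightnow % 600
    return "Signing in to {} at {}".format(domain, sortanow)


if __name__ == "__main__":
    # 同一个600秒窗口内,前端加签与后端验签构造出的消息完全一致
    print(login_message("v3u.cn", 1650000123))  # Signing in to v3u.cn at 1650000000
    print(login_message("v3u.cn", 1650000456))  # Signing in to v3u.cn at 1650000000
```

按这种取整方式,一次签名只在其所处的600秒窗口内可复现,跨窗口重放会验签失败,窗口大小可按业务需要调整。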

小波从此逝,江海寄余生,不但是文坛巨擘还是不世出的编程奇才,天才程序员王小波

二十六年前,王小波先生因病于北京逝世,享年四十四周岁。喜爱他的人,都知道他是一个特立独行的人,拥有谦虚与自豪并存的强大气质,并且留下无数传世作品,无可争议的文坛巨擘,他的力量、有趣,对媚众形式束缚的反抗,以及一以贯之的,对待生活无比真诚的态度都让我们为之倾倒。然而,鲜为人知的是,他不仅仅在文学上造诣非凡,与此同时,他还是一位不世出的编程奇才。在整个九十年代,除了和文字跳舞,王小波还将他的才华通过键盘喷涌而出,天才的脑细胞幻化为一行一行的代码, 挥洒自如,回转如意。王小波在编程领域的惊人艺业,我们也许可以通过他的书信以及著作中的内容略窥一二。1988年12月,致好友刘晓阳:回来之前我曾往人大一分校计算机站写过一封信,问他们可要带什么软件,主管的工程师回了封信,我没收到。回来之后人家还提到此事。现在国内软件一面混乱,又逐渐有形成市场之势。首先以年兄学统计这一事实来看,回来做事非有会用的软件不可。 Macintosh 根本就没打进中国市场,你非带几个可用的IBM微机软件回来不可。至于什么机器上能使倒不必太担心。我这个狗屁计算机室,IBMPS/2就有二台。AT机也不少。SAS SPSS Statistx都有,可代表国内上等一般统计微机房的水平,可就是少了一种宜于作统计的语言。 年兄如有 APL(A Programming Language)之IBM微机本,可给我寄copy来。我在美还有一个户头,连manual复印费一并写支票给你们。Glim我也没有,如年兄有便人可捎来。邮寄太贵,能省就省吧。Macintosh就是苹果电脑,1998年后多被简称为Mac,可见在八十年代刘晓阳就已经用上mac电脑了,但是国内还是以IBM计算机为主流。信中提到的 SAS,SPSS,Statistx,Glim 都是统计学软件,其中 SAS 作为商业软件到现在依然被普遍使用。而值得一提的是,APL (A Programming Language)是一个对二十一世纪的程序员来说不太寻常的计算机语言,APL 最著名的就是它使用一组非ASCII符号。这些符号比一般常见的代数和计算符号要多。有人开玩笑说,用两行这样的奇形怪状的符号就可以将所有航空控制的问题解决了。事实上,在一些APL版本中,用一行代码就可以将任何可计算的函数表达出来,再用一行代码就可以将这个函数的结构表达出来。由于它的精密的结构和非标准的符号,也有人将APL称为“只写语言”。除数学家外,其他人要读APL写的程序都感到非常困难。有些数学家觉得其它语言比APL难懂。由于APL使用不寻常的符号,许多专业程序员在写APL程序时使用专门的APL键盘。而王小波在八十年代就开始使用APL进行编程了。1990年1月,致好友刘晓阳:我现在正给北大社会学所做统计,手上除SPSS没有可用的软件,国内这方面很差。我现在会用FORTRAN,编统计程序不方便。闻兄谈起你们用 S语言,不知是否好用。工具书也不知好找不。不管好歹,烦兄找个拷贝给我,要就算了。照我看只要能解决各种矩阵运算就够:当然也要有各种分布函数。反正也是瞎胡混,我就算努把力,少混点吧。整个八十年代,国内普通用户很难连上互联网,所以学习语言只能看书, 信中提到的S 语言 是 1976 年由贝尔实验室设计的统计学编程语言。不过王小波提到的 S 语言可能指的是 1988 年以后的新版 S 语言。二十一世纪,在统计学领域中被广泛使用的 R语言 是 S 语言的后继者。R语言递归实现斐波那契:recurse_fibonacci <- function(n) { if(n <= 1) { return(n) } else { return(recurse_fibonacci(n-1) + recurse_fibonacci(n-2)) }1991年3月,致好友刘晓阳:你寄来的严氏2.0A我也收到,还没用。因为一者是3盘要倒,二者我自己写的WK也有重大进展。我也自做了词组功能,是棵B树,我觉得自写的软件自用,感觉是最好的。词组用处不是很大,主要用于定义人地名等专有名词,但是严氏软件对我还是有重大启示,拼音加四声是个极好的主意,写起东西来声韵铿锵,与其他软件大不一样。自写一遍,从分页到编辑键分配,都能合乎自家习惯,不是存心狗尾续貂也。如能见到严氏,可代为致意。信中提到的严氏 2.0 应该是当时某个中文输入和处理的软件。因为在其他信件中,王小波提到的 “WK” 是对严氏的仿制,而 WK 是王小波用 C 语言实现的,用来在电脑上写小说。这里王小波提到 WK 用 B 
树实现了词组查询,应该是为了方便自己中文常用词组输入。事实上,在那个年代,懂得用B-tree优化查找效率的人应该是凤毛麟角,而王小波恰恰是其中一位,这说明王小波在90年代初就懂得使用算法和数据结构来优化自己的程序了,他可能不会想到,2022年的今天,会有数以万计的程序员会在一个叫做leetcode的平台上苦练算法和数据机构,以期一个研发职位。1992年1月,致好友刘晓阳:编译程序一盘(有说明书,见shou),源程序一盘。我的音典与严氏同名内容不同。功能上与严氏的近似,但是多了改进拼音字典的功能。按F4后可以把拼音重定义。也可加字,在拼音拣字时,按enter,就进入国标拣字,拣到的字加入字典。 这个软件由五个c语言(另有两个头文件)和一个汇编语言文件组成,可用 turboc编译。假如你用过其它c软件,有一点要提醒你,turbo.c有一种极讨厌的特性,就是你在一个函数内alloc的内存,退出该函数时不会自动释放;还有一点也很糟,就是模型问题,在大模型下写的程序,到了小模型上一概不能用,我的程序是在compact模型下写的,就不能用small来编译,这两条是可以气死人的。据说可以用far,near之类的前缀说明指针,其实是屁用不管。我干了一年多c,得到的结论是微机c还不能使人快乐,有时叫人怀念汇编。这封信他提到 WK 要用 Turbo C 这个 C 编译器来构建。Turbo C 是 Borland 公司 1987 年发布的 C 语言集成开发环境,Turbo C 这样的老 C 编译器是面向 DOS 的,因此提供了不同的内存模式。简单来说就是不同模式下指针基础偏移不同,compact 中的指针在 small 环境下可能会指向错误的地址,所以“大模型下写的程序,到了小模型上一概不能用”。现代程序员如果在应用层开发功能,底层只需要考虑内存溢出问题,而在那个遥远的年代,程序员不仅仅需要考虑整体内存容量问题,还需要解决内存单位的布局问题,这对于现代程序员来说,其复杂度是难以想象的。最后,王小波在其著作《革命时期的爱情》第三章第四节有如下记述:锁在房子里时,精力能够集中。所以我编的第一批软件极有诗意。李后主有词云: 红豆啄残鹦鹉粒。 我的软件就曲折和弹性而言,达到了此句的境界。后主又有残句云: 细雨流湿光。我的软件就有这么简约,别人编十行,我只用一行。等到交活时,教授看了吃一惊:这么短!能跑(run)吗?我说你试试嘛。试完了他和我握手道:谢谢!但是到了开支时,我的钱比别人都少。原来是按行算钱,真把我气死了。等到交第二批软件时,我就吃棉花屙线屎。古诗云: 一个和尚独自归, 关门闭户掩柴扉。我的第二批软件到了这种境界。简言之,别人编一行,我就编了二十行。等到交活时,教授根本不问能不能run,只说:你这是捣蛋!就打回来让我改短。资本主义就是这么虚伪。等到拿了学位,我毫不犹豫就回国来。这是因为我从骨子里来说是个浪漫诗人,作画时是个颜色诗人,写程序时是个软件诗人。干瘪无味的资本主义社会哪里容得下浪漫诗人。这一段简直就是现代编码风格的圭臬,戏谑和深刻齐飞,让人忍俊不禁,乐而忘返。能将感性的唐诗宋词和理性的机械代码相结合,还出落的如羚羊挂角无迹可寻的圆润,恐怕,普天之下,也只有王小波可以做到了。结语:他拥有诗意而有趣的灵魂,他用文字对抗虚无,他用代码实现理念, 他用行动践行了自己的誓言,并一直战斗到死。他的身前是万家灯火,他的身后是繁星闪烁,我们都在守候,我们却又都在送别,小波从此逝,江海寄余生。

轻盈潇洒卓然不群,敏捷编辑器Sublime text 4中文配置Python3开发运行代码环境(Win11+M1 mac)

20世纪初,几乎所有的飞机都是并列双翼结构,此时,美国著名飞行大亨霍华德·休斯认为自己的飞机不够快,助手委婉地提醒他,如果速度太快,飞机的上翼结构支柱很可能会支撑不住,发生断裂。霍华德愤怒地向助手大喊:“谁说我们需要上翼结构?让上翼和支柱见鬼去吧,我们需要的是更轻便的单翼飞机!”于是乎,H1单翼飞机就此出现,这款机型身上体现了霍华德作为一名航空工程师的天才之处:突破性的流线型机身,可收放起落架,轻巧灵动,平面的铆钉和接头以减少空气阻力,因其优美的造型被称为“银色子弹”。同样地,如果你入职了一家公司,当主管拍拍你的肩膀让你往电脑里安装Pycharm的时候,你也可以愤怒地向他大喊:“谁说我们需要Pycharm?让笨重的IDE都见鬼去吧,我只要轻便的Sublime text 4 !”是的,轻便优雅,不是所有人都喜欢披盔戴甲,重装上阵。如果你偏爱轻灵机巧,编写代码恰如春日双燕飞舞柳间,高低左右,回转如意,那么Sublime text 4会是你的最佳选择。Win11系统配置Sublime text 4首先来到Win11环境下,进入Sublime text 4官网的下载页面:https://www.sublimetext.com/download选择Win版本的安装包:下载成功后,双击安装即可。随后,需要安装Python3的安装包,这里推荐3.10最新版本,由于之前安装过,这里就不赘述了,如果是没有安装过Python3的朋友,请移玉步至:一网成擒全端涵盖,在不同架构(Intel x86/Apple m1 silicon)不同开发平台(Win10/Win11/Mac/Ubuntu)上安装配置Python3.10开发环境。Sublime Text 4 是一个扩展性极高的编辑器,所有功能可以使用称为Package Control的插件进行扩展。要安装、更新和管理软件,我们需要在 Sublime Text 4上安装 Package Control。打开Sublime Text 4 ,选择 菜单 -> menu Tools -> Install Package Control 进行安装:安装过程需要等待一小会,一旦安装成功,会有相应的提示信息:下面我们就可以利用Package Control安装一些扩展软件了,可以通过 菜单 -> Install Package Control option 来激活安装命令行,也可以通过快捷键 Ctrl+Shift+P 来激活安装命令行:随后输入install后选择install package 回车选择。在安装搜索框里,键入:Chinese 选择 ChineseLocalizations 回车安装中文扩展。安装好以后,我们的Sublime Text 4就可以支持中文显示了。接着安装Python3的扩展,和Sublime Text 3 配置Python3不同的是,Sublime Text 4 只需要一个插件即可以运行Python3,那就是 AnacondaCtrl+Shift+P 来激活安装命令行 install package 键入:Anaconda安装好以后,新建一个test.py:def mytest(): print("Hello Sublime Text 4 ! ") if __name__ == "__main__": mytest()利用快捷键 ctrl + b 就可以直接运行代码:Hello Sublime Text 4 ! 
[Finished in 152ms]非常方便,当然了,由于Anaconda的语法检测相对严格,会出现一些“白框”的提示,我们可以用过Sublime Text 4的 首选项 -> Package Settings -> Anaconda -> Settings-User 进行设置,打开配置文件后键入:{"anaconda_linting":false}保存设置以后,不会出现白框,并且可以通过Anaconda进行自动补全:和 Sublime Text 3 相比,Sublime Text 4 配置 Python3 相对快捷方便了很多,只需要Chinese和Anaconda这两个插件即可:当然了,插件可以进行安装,同时也支持卸载。Ctrl+Shift+P 来激活安装命令行 Remove package 然后选择需要卸载的插件即可。除了可以针对安装的软件进行配置,也可以单独修改Sublime Text 4的配置,选择 首选项 -> 快捷键设置:[ "keys": ["alt+l"], "command": "toggle_setting", "args": "setting": "line_numbers" ]我们就可以使用 alt + l 的快捷键来控制Sublime Text 4的行号显示。如果愿意,我们还可以通过官网安装Sublime Text 4的Git 扩展 SublimeMerge : https://www.sublimemerge.com/download这样,我们就可以在Sublime Text 4 操作 代码的分支、提交、以及推送了:M1 Mac 系统 配置 Sublime text 4回到Mac,https://www.sublimetext.com/download 选择 mac 的压缩包:注意,即使是M1芯片的Mac也不要选择底下的ARM64版本,同样是选择MacOS版本,注意下载成功之后并不是dmg扩展的安装包,而是sublime\_text\_build\_4126\_mac.zip,里面是软件本体,需要进行解压操作。解压之后,将软件本体直接拖动到应用程序目录中即可:随后,同样选择 菜单 -> menu Tools -> Install Package Control 进行安装安装成功后,激活命令行的快捷键变成了 Command+Shift+P同时运行代码的快捷键变成了 Command + b修改配置文件也换到了 Preferences 中:设置完毕以后,我们就可以享受Sublime text 4带给我们的极速编码之旅了,Enjoy it!Sublime text 4 也新增了针对GPU加速的界面渲染,理论上来说输入延迟可以进一步降低。市面上没有比它输入延迟低,反应更迅速,资源占用更少的编辑器了,丝滑顺畅,反观Pycharm/VSCode对硬件的要求很高,需要好的CPU/内存支持,另外如果在内网环境,没有公网,这种情况下Sublime text 4离线导入配置,导入插件的体验是更好的,直接配置文件夹打包拿到内网就直接用了,而vscode对网络的依赖度还是比较高的,折腾麻烦。结语:是的,也许你的主管会和你说:“组里的同事都在用Pycharm,你为什么不用?”,毫无疑问,人往往是按照别人的期待,活成别人希望的样子,最后丢掉自己。你愿意低质量的合群、讨好别人,还是一个人独处、坚持做自己?这个时候也许你更该问自己一个问题:Why are you trying so hard to fit in when you were born to stand out ?如果你生来与众不同,何苦非要融入这群乌合之众呢?
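上文的行号快捷键设置,完整的 .sublime-keymap 文件内容大致如下(注意每条绑定与 args 都需要包在花括号对象中,alt+l 仅为示例按键,可按个人习惯调整):

```json
[
    {
        "keys": ["alt+l"],
        "command": "toggle_setting",
        "args": { "setting": "line_numbers" }
    }
]
```

保存后即可用 alt+l 随时切换行号显示。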

孔雀折翼空中浩劫,东航MU5735航班高空垂直骤降八千米坠毁失事原因技术性分析

公元2022年3月21日北京时间下午2点22分,东航MU5735航班(昆明至广州航段),以接近音速的速度和近乎垂直的角度,高速俯冲地面坠毁。坠毁位置位于广西省梧州市埌南镇莫埌村神塘表附近的山林中,坠毁前,该航班并未挂出“7700”紧急代码。北京时间下午1点16分,MU5735航班顺利起飞,并在升空的十分钟后飞行至海拔7500米左右的高度。此后,这架航班以每小时830公里左右的飞行速度途径云南百色市、广西南宁市和玉林市的上空,前往广州。MU5735航班在持续飞行近63分钟后,其航行高度突然从海拔约8870米的高度下降至约2770米,时长约持续了2分15秒。当时,这架飞机依然以每小时约842公里的速度飞行。此后的一分钟,这架波音737-800客机的飞行高度持续下降,系统最后捕捉到的是其在海拔983米的高度以每小时696公里的速度飞行时留下的记录。该航班机型为波音737-800(NG),机龄6.8年,编号为B1791,波音737-800是由美国波音公司设计并且研发的,其长39.5米,高12.5米,翼展35.79米,可选高达2.60米的翼尖小翼,最大可搭载189名乘客,机舱内逃生门每边2扇。飞行数据从该航班的高度表可以看出,这架飞机最后坠毁的时候,数据并不正常,它从8000多米高空极快地坠落,几乎可以被理解为飞机完全失去了上升的可能性,机头垂直向下,完全没有拉起。监控画面可见,当飞机坠落的时候,它的状态还是完整的,不是变成一堆部件掉下来的。那么这种状况下,如果飞行员对飞机还有控制的话,其通过调整飞机的各个控制面,比如说机翼、副翼或者升降舵,飞机不应该是这样下落的,它至少应该是以一个滑翔或者说是斜角度的俯冲来接近地面的。如果说头下尾上,直直地冲下来的话,我们可以认为飞行员已经没有办法再控制飞机的控制面,飞行员的操纵动作不能改变飞机的姿态。软件因素既然,飞行员无法有效地控制飞机,可能有什么原因导致这种现象呢?飞行器中的计算机软件控制系统可能是诱因之一。公元2019年3月10日,埃塞俄比亚航空一架波音737 MAX 8型客机(注册编号:ET-AVJ)载有157人从亚的斯亚贝巴飞往内罗毕的航班途中,于起飞后6分钟,同样近乎垂直角度高速坠毁于距离机场62千米外的小镇德布雷塞特,机上人员全部遇难。涉事飞机于2018年10月30日首飞,同年11月15日交付,机龄仅4个月。初步调查报告指出飞行员已完全遵从波音及美国联邦航空管理局发出的建议和指引去处理紧急情况,但仍无法修正控制系统持续压低机头,降低攻角的情况。喷气式飞机之所以能在空中飞行,是因为飞行过程中,机翼在不断“攻击”前方的空气。顾名思义,攻角(Angle of Attack,AOA)就是指机翼弦线与相对风向之间的夹角。在一定的范围内,攻角越大,气流对机翼“推动”作用向上的分量也就越大,从而飞机的升力越大。但是,如果攻角过大,升力就会急剧下降,从而发生失速。这个达到最大升力但即将失速的攻角称为临界攻角,阈值区间一般是十几到二十几度。说白了就是,飞机头抬的越高,飞机的升力越大,飞的越高,但是你不能无限抬机头,一旦机头抬起的角度过大,飞机形态变成近垂直,飞机会直接失去速度,从空中“掉”下来。那么,既然飞机攻角过大会导致失速,为防止失速,飞机安装了MCAS系统,也就是在飞机的攻角传感器检测到攻角超过临界攻角时,自动转动水平安定面,下压机头。也就是说,软件系统会对飞行姿态进行校正,校正指标就是攻角。说白了就是,MCAS系统会以攻角大小作为判别标准,如果系统觉得攻角过大,就会不停的下压机头进行校正,而机头越低,飞机的俯冲力就越大,会一直往下“扎”下去,所以这套系统致命的问题是没有考虑到,万一攻角检测是错的怎么办?事实上,埃航这次机毁人亡的事故原因就是飞机的攻角传感器出了问题,而计算机MCAS系统又只以攻角为唯一的判定标准,一直在下压机头,导致飞机近垂直角度俯冲地面坠毁。容错机制计算机系统中有个著名的拜占庭将军问题,就是几个军官之间在通信不可靠、军官中还可能有叛徒的情况下,如何保证忠诚的军官之间能达成一致的协议。拜占庭将军问题提出后,有很多的算法被提出用于解决这个问题。这类算法统称拜占庭容错算法(BFT: Byzantine Fault Tolerance)。简略来说,拜占庭容错(BFT)不是某一个具体算法,而是能够抵抗拜占庭将军问题导致的一系列失利的系统特点。 
这意味着即使某些节点出现缺点或恶意行为,拜占庭容错系统也能够继续运转。本质上来说,拜占庭容错方案就是少数服从多数。说白了就是,不能以一个单一指标来判定系统行为。回到生活中来,相对于航空领域,我们显然更加熟悉汽车行业,现在越来越多的主机厂都以“配置高”为卖车的噱头,特别是“主动安全系统”,简直人手一套:所谓“主动安全系统”,就是在系统通过某些数值判定与前车距离是否过小,如果距离过小,系统会主动进行“校正”动作,也就是帮你踩刹车的一种行为。殊不知,如果这套系统没有做好容错机制,没有通过大量的算法指标来进行短时间判定,可能会造成一种“反行为”:跟车距离正常的情况下,红外线雷达或者纳米波雷达出了问题,它突然给你来一脚刹车,还是踩死的那种,这样,后车完全可能会因为你突然急刹,来不及反应而追尾你。本来“主动安全系统”是为了防止追尾,结果偏偏会导致“反向”追尾,根本原因就是没有做好容错机制,除了雷达距离,系统还应该有别的参照物作为算法依据来判定行为,所以,有计划买车的朋友最好不要一味追求“主动安全系统”等高配置,因为这种系统需要海量的数据作为参照,同时构建模型针对测试数据进行机器学习和训练,而没有大量累积过相关经验的主机厂很难将系统设计的平衡。同理,埃航飞机所搭载的MCAS系统本来是为了防止攻角过大导致失速坠毁,反而因为传感器故障,导致一味地压低机头而坠毁,显然不符合飞控系统的容错性要求。在此次事故后的软件更新中,波音的飞控系统将会比较攻角传感器的输入,如果相差超过5.5度就不会触发MCAS系统,这才解决了问题。系统设计实际上,带有容错机制的系统设计比比皆是,比如之前的一篇文章:基于Docker-compose搭建Redis高可用集群-哨兵模式(Redis-Sentinel),sentinel(哨兵)系统可以监视一个或者多个redis master服务,以及这些master服务的所有从服务;当某个master服务下线时,自动将该master下的某个从服务升级为master服务替代已下线的master服务继续处理请求。然而,这套系统的核心其实是,到底系统怎么判定主机“下线了”?事实上,哨兵会把“下线”分为两种状态。主观下线:一个哨兵如果自己觉得一个 master 下线了,那么就是主观下线。客观下线:如果一定数量的哨兵都觉得一个 master 下线了,那么就是客观下线,客观下线即真实下线。主观下线达成的条件很简单,如果一个哨兵 ping 一个 master,超过了指定的断开毫秒数之后,就主观认为 master 下线。如果一个哨兵在指定时间内,收到了一定指定数量的其他哨兵也认为那个 master 是 主观下线,那么就认为是客观下线了,客观认为 master 宕机,所以立刻切换主机。这样采用少数服从多数的拜占庭容错算法系统,就可以大概率避免由于系统判断错误导致的主库突然进行切换,从而产生两个“主库”的问题。回到飞行系统上,无论是“攻角”问题,还是“失速”问题,或者是一些其他的什么问题,都应该搭配一套完备的拜占庭容错算法系统,即飞行过程中,无论出现了什么异常情况,判别标准不应该是死的,或者是某一方的主观认定,而是通过少数服从多数的“民主”判定方法,比如飞行系统会给出判定,机长也可以给出自己的判定,副机长同样可以给出判定,遵循少数服从多数原则,从而避免误操作行为。当然了,此次坠毁事故中,东航的波音737-800和埃航的737 MAX并不是同一机型,737-800也并未搭载波音的MCAS系统,但是我们也不能因此就排除软件系统故障的原因,毕竟,随着物联网、自动驾驶、工业自动化和工业互联网的普及,越来越多重要甚至生命攸关的系统将使用软件进行自动控制。机械故障除了软件因素,硬件层面的机械故障也可能是诱因之一。根据东航MU5735的高度曲线,和阿拉斯加航空261号航班事故很相像,公元2000年1月31日当地时间下午4点20分,失事飞机为一架MD-83(麦克唐纳-道格拉斯(McDonnell Douglas),一般简称为麦道),在经历了灾难性的俯仰失控之后,飞机坠毁于加利福尼亚州安那卡帕岛以北4.3公里的太平洋里。飞机上的2名飞行员、3名乘务员以及83位乘客全部遇难。事后调查查明飞机机件维修不足导致过度磨损,并在飞行过程中引发了飞行控制系统灾难性故障。空难的可能原因是“因飞行途中水平尾翼平衡调节系统起重螺杆顶部螺母的螺纹失效而造成俯仰失控。螺母螺纹的失效归因于阿拉斯加航空对其所装配起重螺杆装备润滑不足导致的过度磨损。”说白了就是,飞机的水平尾翼(horizontal tail)坏了,水平尾翼的工作原理很简单:当我们需要操纵飞机抬头或低头时,水平尾翼中的升降舵就会发生作用。 升降舵是水平尾翼中可操纵的翼面部分,其作用是对飞机进行俯仰操纵。 
当需要飞机抬头向上飞行时,驾驶员就会操纵升降舵向上偏转,此时升降舵所受到的气动力就会产生一个抬头的力矩,飞机就抬头向上了,同理,当水平尾翼向下偏转的时候气动力就会产生一个压头的力矩,让飞机压低机头向下飞,水平尾翼向下角度越大,飞机越往下“扎”:而阿拉斯加航空261号就是水平尾翼故障,一直向下偏转,导致机头一直下压,没有办法把飞机拉起来,随后也是近乎垂直的俯冲到了海面上,发生坠毁。水平翻转有人说,出现这种器质性的机械损坏故障,就只能认命,没有任何补救方案了吗?别太笃定,2012年由丹泽尔·华盛顿主演的《迫降航班》给了我们答案。飞行员在水平尾翼升降舵卡死的情况下把飞机翻转了180度,在近地面机腹向上飞行了一段时间,飞机的横滚倒飞有效抑制了高速俯冲,同时给迫降提供了时间:原理就是,升力的产生不仅和机翼形状有关系,还和机翼相对飞行方向的角度(即上文提到的攻角)有关系。比如飞机正常飞行时,拼命压头的话,机翼受到气流的力就有可能是朝下了;同样的道理,倒飞时也有可能产生向上的升力,只不过这是非常规的操作,飞行手册上并未记载该解决方案。但这种操作也需要一些客观条件,比如飞机要有足够高的高度和足够快的速度,还要剩有足够的动力,影片中机长在近地面2100米高度,几秒内马上要坠毁的情况下,冷静的丢弃备用油箱减轻重量,收起襟翼和起落架减少风阻,同时加大油门,果断横滚翻转,终于在近地面550米高度,扶大厦于将倾,挽狂澜于既倒。不要认为这都是电影的演绎,早在1955年,波音707客机的原型机试飞,波音试飞员Tex Johnston在没有预先告知公司高层的情况下,擅自驾机完成了一个横滚翻转动作,技惊四座:回到东航MU5735航班事故上,当时飞机以每小时842公里的速度从海拔约8870米的高度压头俯冲,整个机组成员只有不到半分钟的反应时间,在这半分钟内,正副机长需要达成一致,果断进行横滚翻转动作,过程中飞机引擎必须持续提供高动力输出,同时737的机身强度还得保证不在空中解体,毫无疑问,这是个不可能完成的任务,再不事先预知的情况下,再优秀的飞行员也无力回天。结语:德国人帕布斯·海恩提出一个在航空界关于飞行安全的法则,海恩法则指出: 每一起严重事故的背后,必然有29次轻微事故和300次未遂先兆以及1000次事故隐患。 法则强调两点:一是事故的发生是量的积累的结果;二是再好的技术,再完美的系统,在实际操作层面,也无法取代人自身的素质和责任心。无论是飞机软件控制的问题,还是硬件机械的问题,我们都希望,东航MU5735的悲剧,不再重演。
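上文"少数服从多数"的容错思路,可以用一段极简的 Python 示意代码表达(攻角临界值、传感器偏差上限等数值均为假设,并非波音实际参数):

```python
import statistics

def should_push_nose_down(aoa_readings, critical_aoa=15.0, max_disagreement=5.5):
    """多传感器容错示意:
    1. 各传感器读数彼此偏差过大 -> 判定传感器故障,拒绝自动压头;
    2. 读数一致时,取中位数(少数服从多数)与临界攻角比较。"""
    if max(aoa_readings) - min(aoa_readings) > max_disagreement:
        return False
    return statistics.median(aoa_readings) > critical_aoa

print(should_push_nose_down([20.1, 20.5, 19.8]))  # True:三个读数一致且超限
print(should_push_nose_down([45.0, 2.1, 2.3]))    # False:某个传感器明显异常
```

单一攻角传感器作为唯一判定依据时,上面第二种情形就会被错误地当成"攻角过大"而持续压头,这正是容错机制要避免的。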

并发异步编程之争:协程(asyncio)到底需不需要加锁?(线程/协程安全/挂起/主动切换)Python3

协程与线程向来焦孟不离,但事实上是,线程更被我们所熟知,在Python编程领域,单核同时间内只能有一个线程运行,这并不是什么缺陷,这实际上是符合客观逻辑的,单核处理器本来就没法同时处理两件事情,要同时进行多件事情本来就需要正在运行的让出处理器,然后才能去处理另一件事情,左手画方右手画圆在现实中本来就不成立,只不过这个让出的过程是线程调度器主动抢占的。线程安全系统的线程调度器是假设不同的线程是毫无关系的,所以它平均地分配时间片让处理器一视同仁,雨露均沾。但是Python受限于GIL全局解释器锁,任何Python线程执行前,必须先获得GIL锁,然后,每执行100条字节码,解释器就自动释放GIL锁,让别的线程有机会执行。这个GIL全局解释器锁实际上把所有线程的执行代码都给上了锁,所以,多线程在Python中只能交替执行,即使多个线程跑在8核处理上,也只能用到1个核。但其实,这并不是事情的全貌,就算只能用单核处理任务,多个线程之前也并不是完全独立的,它们会操作同一个资源。于是,大家又发明了同步锁,使得一段时间内只有一个线程可以操作这个资源,其他线程只能等待:import threading balance = 0 def change_it_without_lock(n): global balance # 不加锁的话 最后的值不是0 # 线程共享数据危险在于 多个线程同时改同一个变量 # 如果每个线程按顺序执行,那么值会是0, 但是线程时系统调度,又不确定性,交替进行 # 没锁的话,同时修改变量 # 所以加锁是为了同时只有一个线程再修改,别的线程表一定不能改 for i in range(1000000): balance = balance + n balance = balance - n def change_it_with_lock(n): global balance if lock.acquire(): for i in range(1000000): balance = balance + n balance = balance - n # 这里的finally 防止中途出错了,也能释放锁 finally: lock.release() threads = [ threading.Thread(target=change_it_with_lock, args=(8, )), threading.Thread(target=change_it_with_lock, args=(10, )) lock = threading.Lock() [t.start() for t in threads] [t.join() for t in threads] print(balance)这种异步编程方式被广大开发者所认可,线程并不安全,线程操作共享资源需要加锁。然而人们很快发现,这种处理方式是在画蛇添足,处理器本来同一时间就只能有一个线程在运行。是线程调度器抢占划分时间片给其他线程跑,而现在,多了把锁,其他线程又说我拿不到锁,我得拿到锁才能操作。就像以前的公共电话亭,本来就只能一个人打电话,现在电话亭上加了把锁,还是只能一个人打电话,而有没有锁,有什么区别呢?所以,问题到底出在哪儿?事实上,在所有线程相互独立且不会操作同一资源的模式下,抢占式的线程调度器是非常不错的选择,因为它可以保证所有的线程都可以被分到时间片不被垃圾代码所拖累。而如果操作同一资源,抢占式的线程就不那么让人愉快了。协程过了一段时间,人们发现经常需要异步操作共享资源的情况下,主动让出时间片的协程模式比线程抢占式分配的效率要好,也更简单。从实际开发角度看,与线程相比,这种主动让出型的调度方式更为高效。一方面,它让调用者自己来决定什么时候让出,比操作系统的抢占式调度所需要的时间代价要小很多。后者为了能恢复现场会在切换线程时保存相当多的状态,并且会非常频繁地进行切换。另一方面,协程本身可以做成用户态,每个协程的体积比线程要小得多,因此一个进程可以容纳数量相当可观的协程任务。import asyncio balance = 0 async def change_it_without_lock(n): global balance balance = balance + n balance = balance - n loop = asyncio.get_event_loop() res = loop.run_until_complete( asyncio.gather(change_it_without_lock(10), change_it_without_lock(8), change_it_without_lock(2), 
change_it_without_lock(7))) print(balance)从代码结构上看,协程保证了编写过程中的思维连贯性,使得函数(闭包)体本身就无缝保持了程序状态。逻辑紧凑,可读性高,不易写出错的代码,可调试性强。但归根结底,单核处理器还是同时间只能做一件事,所以同一时间点还是只能有一个协程任务运行,它和线程的最主要差别就是,协程是主动让出使用权,而线程是抢占使用权,即所谓的,协程是用户态,线程是系统态。同时,如图所示,协程本身就是单线程的,即不会触发系统的全局解释器锁(GIL),同时也不需要系统的线程调度器参与抢占式的调度,避免了多线程的上下文切换,所以它的性能要比多线程好。协程安全回到并发竞争带来的安全问题上,既然同一时间只能有一个协程任务运行,并且协程切换并不是系统态抢占式,那么协程一定是安全的:import asyncio balance = 0 async def change_it_without_lock(n): global balance balance = balance + n balance = balance - n print(balance) loop = asyncio.get_event_loop() res = loop.run_until_complete( asyncio.gather(change_it_without_lock(10), change_it_without_lock(8), change_it_without_lock(2), change_it_without_lock(7))) print(balance)运行结果:0 liuyue:as-master liuyue$看起来是这样的,无论是执行过程中,还是最后执行结果,都保证了其状态的一致性。于是,协程操作共享变量不需要加锁的结论开始在坊间流传。毫无疑问,谁主张,谁举证,上面的代码也充分说明了这个结论的正确性,然而我们都忽略了一个客观事实,那就是代码中没有“主动让出使用权”的操作,所谓主动让出使用权,即用户主动触发协程切换,那到底怎么主动让出使用权?使用 await 关键字。await 是 Python 3.5版本开始引入了新的关键字,即Python3.4版本的yield from,它能做什么?它可以在协程内部用await调用另一个协程实现异步操作,或者说的更简单一点,它可以挂起当前协程任务,去手动异步执行另一个协程,这就是主动让出“使用权”:async def hello(): print("Hello world!") r = await asyncio.sleep(1) print("Hello again!")当我们执行第一句代码print("Hello world!")之后,使用await关键字让出使用权,也可以理解为把程序“暂时”挂起,此时使用权让出以后,别的协程就可以进行执行,随后当我们让出使用权1秒之后,当别的协程任务执行完毕,又或者别的协程任务也“主动”让出了使用权,协程又可以切回来,继续执行我们当前的任务,也就是第二行代码print("Hello again!")。了解了协程如何主动切换,让我们继续之前的逻辑:import asyncio balance = 0 async def change_it_without_lock(n): global balance balance = balance + n await asyncio.sleep(1) balance = balance - n print(balance) loop = asyncio.get_event_loop() res = loop.run_until_complete( asyncio.gather(change_it_without_lock(10), change_it_without_lock(8), change_it_without_lock(2), change_it_without_lock(7))) print(balance)逻辑有了些许修改,当我对全局变量balance进行加法运算后,主动释放使用权,让别的协程运行,随后立刻切换回来,再进行减法运算,如此往复,同时开启四个协程任务,让我们来看一下代码运行结果:17 liuyue:mytornado liuyue$可以看到,协程运行过程中,并没有保证“状态一致”,也就是一旦通过await关键字切换协程,变量的状态并不会进行同步,从而导致执行过程中变量状态的“混乱状态”,但是所有协程执行完毕后,变量balance的最终结果是0,意味着协程操作变量的最终一致性是可以保证的。为了对比,我们再用多线程试一下同样的逻辑:import 
threading import time balance = 0 def change_it_without_lock(n): global balance for i in range(1000000): balance = balance + n balance = balance - n print(balance) threads = [ threading.Thread(target=change_it_without_lock, args=(8, )), threading.Thread(target=change_it_without_lock, args=(10, )), threading.Thread(target=change_it_without_lock, args=(10, )), threading.Thread(target=change_it_without_lock, args=(8, )) [t.start() for t in threads] [t.join() for t in threads] print(balance)多线程逻辑执行结果:liuyue:mytornado liuyue$ python3 "/Users/liuyue/wodfan/work/mytornado/test.py" 8可以看到,多线程在未加锁的情况下,连最终一致性也无法保证,因为线程是系统态切换,虽然同时只能有一个线程执行,但切换过程是争抢的,也就会导致写操作被原子性覆盖,而协程虽然在手动切换过程中也无法保证状态一致,但是可以保证最终一致性呢?因为协程是用户态,切换过程是协作的,所以写操作不会被争抢覆盖,会被顺序执行,所以肯定可以保证最终一致性。协程在工作状态中,主动切换了使用权,而我们又想在执行过程中保证共享数据的强一致性,该怎么办?毫无疑问,还是只能加锁:import asyncio balance = 0 async def change_it_with_lock(n): async with lock: global balance balance = balance + n await asyncio.sleep(1) balance = balance - n print(balance) lock = asyncio.Lock() loop = asyncio.get_event_loop() res = loop.run_until_complete( asyncio.gather(change_it_with_lock(10), change_it_with_lock(8), change_it_with_lock(2), change_it_with_lock(7))) print(balance)协程加锁执行后结果:liuyue:mytornado liuyue$ python3 "/Users/liuyue/wodfan/work/mytornado/test.py" 0是的,无论是结果,还是过程中,都保持了其一致性,但是我们也付出了相应的代价,那就是任务又回到了线性同步执行,再也没有异步的加持了。话说回来,世界上的事情本来就是这样,本来就没有两全其美的解决方案,又要共享状态,又想多协程,还想变量安全,这可能吗?协程是否需要加锁结论当然就是看使用场景,如果协程在操作共享变量的过程中,没有主动放弃执行权(await),也就是没有切换挂起状态,那就不需要加锁,执行过程本身就是安全的;可是如果在执行事务逻辑块中主动放弃执行权了,会分两种情况,如果在逻辑执行过程中我们需要判断变量状态,或者执行过程中要根据变量状态进行一些下游操作,则必须加锁,如果我们不关注执行过程中的状态,只关注最终结果一致性,则不需要加锁。是的,抛开剂量谈毒性,是不客观的,给一个健康的人注射吗啡是犯罪,但是给一个垂死的人注射吗啡,那就是最大的道德,所以说,道德不是空泛的,脱离对象孤立存在的,同理,抛开场景谈逻辑,也是不客观的,协程也不是虚空的,脱离具体场景孤立存在的,我们应该养成具体问题具体分析的辩证唯物思想,只有掌握了辩证的矛盾思维才能更全面更灵活的看待问题,才能透过现象,把握本质。
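顺带补全文中开头那段线程加锁的示例:finally 必须配合 try 使用,更 Pythonic 的写法是用 with 语句自动获取和释放锁,下面是一个可直接运行的完整版本(循环次数适当缩小,便于快速验证):

```python
import threading

balance = 0
lock = threading.Lock()

def change_it_with_lock(n):
    global balance
    # with 语句等价于 try: lock.acquire() ... finally: lock.release()
    with lock:
        for _ in range(100000):
            balance = balance + n
            balance = balance - n

threads = [
    threading.Thread(target=change_it_with_lock, args=(8,)),
    threading.Thread(target=change_it_with_lock, args=(10,)),
]
[t.start() for t in threads]
[t.join() for t in threads]
print(balance)  # 0:加锁后共享变量保持最终一致
```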

增效降本开源节流,2023年技术趋势前瞻(异步编程/容器技术)

2023初始,凛冬已至,疫情横跳, 环境繁复,君不见互联网大厂纷纷裁员,银根紧缩。这一切归结为两个字:成本。对于互联网企业来讲,除了最基本的工商财税,办公室、办公设备、人力、产品和公关等等,这一切都是成本。而在疫情因素侵入导致经济下滑的情况下,降本增效就已经成为2023开年很多企业管理者非常重视的 KPI指标,而降本也一定会成为2023年技术发展的一个必然趋势。降本增效,到底降什么本,增什么效,有何妙计?异步编程一直以来,异步编程都是最有经验的开发者的专长,他们孜孜不倦地研究着非线性执行流中的回调方法,念兹在兹的,不过就是有限资源下每秒处理请求数的提升。异步编程方式也许是开发者对自己的严格要求,但带来的收益无疑也是非常可观的,以Python的web开发领域为例:2022年web框架性能排行榜中,排名前十的无一例外全部是异步框架,所以,异步编程方式可以给我们带来什么?是更高的每秒处理请求数,而更高的每秒处理请求数又能带给我们什么?是更低的服务器成本。那么异步编程到底怎么帮我们节约资源呢?本质上,异步提升的是服务器的吞吐量,而并非系统的性能,因为,CPU密集型的异步任务和同步效率差不多,也就意味着异步这是资源利用率提升,而非系统性能真的提升了。如果使用同样的CPU资源,处理同样的资源也会花费同样的时间。假设同步意味着大量的阻塞,此时CPU的资源利用率较低,吞吐量也会同比降低。如果使用异步,那么CPU很容易就能拉满,此时吞吐量就会升高。打个比方,京津高速双向八车道,日均车流量近5万车次,这里有个大前提,所有车道都得有车通行才可以,可是如果车辆根本就不变道,单向通行只用其中的两个车道,那多出来的两个车道意义何在?无疑是成本的浪费,大多数情况下,高并发场景下的同步编程方式就是在浪费系统资源。再者,如果一套服务不能有效利用一台服务器的资源,那必然需要更多的服务器通过运行更多的应用实例来弥补需求缺口。例如一个百万日活服务应用,假设同步框架单台机器可以抗住400-500并发,大抵需要七台服务器才能堪堪挡住,如果使用 Python 异步框架,重构后由原来的 七台服务器削减至三台,成本骤降 57%。而一台8核16G,10M共享带宽的百度智能云BCC计算型服务器,预付费一年的价格大概为1.2万人民币。假设我们不考虑服务器硬件成本,那也会由此引发出效率成本的消耗。当服务器数量堆叠到一定规模后,如果不改进底层逻辑和实现,无脑加机器其实是徒劳,并且运维成本会骤然增加,大批量服务器的监控、持续集成、持续部署都将会是不小的开销。综上,想要降低成本,异步编程就会是那一把关键的锁钥,当然了,相应地,对于开发人员的综合业务能力的要求也会有一定的提高,也许有的人认为异步编程会降低开发效率,异步写法也会降低代码可读性,殊不知,学贵大成,不贵小用。如果觉得异步编程晦涩难懂,可读性差,也许应该自我归因,努力提升自己以适应新时代的编程方式才是王道。容器技术容器,解决了应用打包标准化以及发布标准化的问题。早年间虚拟机方式的标准化程度是远远不够的,Docker容器终结了这一问题。随着 Docker 的不断演进和推广,在应用编排、资源调度等层面又出现了新的问题,早期的 Docker Swarm、Mesos 和 Kubernetes 互相竞争,最后 Kubernetes 胜出,并带来了新的资源编排方面的事实标准。在2022年的今天, Kubernetes 已经成为一个事实标准。标准化以后,Kubernetes上的业务类型越来越丰富,从最初的无状态到后来的有状态,如今像人工智能这样比较复杂的计算引擎也都放在 Kubernetes 上了,这就是一个相互促进的过程,这上面的负载类型越来越多,整个 K8S 体系也确实变得越来越复杂,但它能够管理的东西也越来越多。如果所有用户都抛弃传统部署方式,完全使用容器,那容器的复杂度必然会提升。但对于互联网企业来讲,要做的就是在容器能做更多事情后去降低它的复杂度和成本,否则容器的门槛就会非常高。现在,无论是阿里云、百度云还是腾讯云都在考虑从智能运维角度做更多的努力。比如在集群管理方面,如何用智能运维的方式发现当前运行中的一些状况,并且能够给出处理办法。现在也有智能应用画像和资源画像方式提高资源利用率。综上,智能化运维也会是2022的一个基本技术趋势,而围绕容器的生态可以认为是未来最重要的技术走向之一,它必然会引起一系列的变化,包括企业内部组织的变化。我们会看到,不仅整个运维管理体系在变化,企业内部也会出现新的组织形态,比如 Google 提出的 SRE 团队就是为可用性负责。现在很多深度使用云原生的企业,包括阿里,都有专门的 SRE 
团队,这个团队会负责整个可用性相关能力的建设。其次,企业也会出现一些平台横向性的部门,基于云原生体系去支撑上方业务的部门。以前有些企业可能是偏竖井式的业务单元,即一个业务单元下面有支撑团队,容器包括 K8S 会让企业内部有更多平台横向型部门的出现。这也是解决复杂度的一个方法,因为不是每个纵向的业务部门都有足够的资源投入和专业度去解决这个问题。企业足够大的时候一定要考虑在 SRE 层和平台建设层形成横向部门,进行职能分离。回到降低成本的主题,互联网企业可以改进的方案其实有不少。从偏底层看,很多云厂商和头部互联网公司在自研芯片,华为的例子告诉我们,进口芯片不仅会导致成本溢出问题,玩不好可能连生命线都被人掐断,所以 软件硬件结合一体才能带来根本上的降本增效。另外就是容器化操作系统,这个领域可以理解为容器技术的一个分支领域,它是基础设施层面一个比较重要的优化方法,降低了操作系统安装、适配以及稳定化的成本。此外,弹性应用技术是很多云厂商广泛使用的降本增效的方法。很多互联网公司在尝试弹性离在线混部技术,本质上还是提高服务器利用率。之前各个公司在自购服务器或者云上购买云服务器的利用率往往都低于 10%。这个利用率并不高,很多公司尝试去推高这个水平线,但推高水平线必然会带来很多技术挑战,比如利用率高了之后,多种负载混合跑时,是否会互相影响。几大厂商都在通过开源或商业化产品形式尝试输出离在线混部(多种负载混合部署)技术,相信在2022年,离在线混部技术将迎来进一步的产品化高峰。结语:降低成本这件事情其实是所有资本家永远的诉求,只不过当公司在起步或者高速发展的阶段这个诉求并没有那么强烈,会以业务先行。但是随着业务进入平稳期或者遇到大环境的限制时,降本需求会更明显,所以2023年,通过异步编程方式以及容器的技术的改进,提升服务器利用率从而降低成本会是大趋势。是的,不计成本一门心思搞科研的时代也许已经过去了,就像某部香港电影里说的那样:没错,我是小角色来的,你可以拒绝我,但是,你可以拒绝这个时代吗?
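上文提到异步提升的是吞吐量而非单任务性能,这一点用标准库 asyncio 就能直观验证:10 个各需等待 0.1 秒的 IO 任务并发执行,总耗时仍约 0.1 秒,而同步顺序执行则需要约 1 秒(以下延迟数值仅为示意):

```python
import asyncio
import time

async def fake_io(delay):
    # 用 sleep 模拟一次网络/数据库等待,期间事件循环可调度其他任务
    await asyncio.sleep(delay)

async def main():
    t0 = time.perf_counter()
    await asyncio.gather(*(fake_io(0.1) for _ in range(10)))
    return time.perf_counter() - t0

elapsed = asyncio.run(main())
print(f"并发执行 10 个 0.1s 的 IO 任务,总耗时约 {elapsed:.2f}s")
```

CPU 没有变快,只是等待不再被浪费,这就是"资源利用率提升"的含义。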

神工鬼斧惟肖惟妙,M1 mac系统深度学习框架Pytorch的二次元动漫动画风格迁移滤镜AnimeGANv2+Ffmpeg(图片+视频)快速实践

前段时间,业界鼎鼎有名的动漫风格转化滤镜库AnimeGAN发布了最新的v2版本,一时间街谈巷议,风头无两。提起二次元,目前国内用户基数最大的无疑是抖音客户端,其内置的一款动画转换滤镜“变身漫画”,能够让用户在直播中,把自己的实际外貌转换为二次元“画风”。对于二次元粉丝来说,“打破次元壁,变身纸片人”这种自娱自乐方式可谓屡试不爽:但是看多了就难免有些审美疲劳,千人一面的“锥子脸”,一成不变的“卡姿兰”式大眼睛,让人多少有点味同嚼蜡的感觉,未免过犹不及,失之现实。而基于CartoonGan的AnimeGAN动漫风格滤镜则能够在保留原图特点的同时,兼具二次元的炫酷和三次元的写实,颇有些刚柔并济、举重若轻的感觉:并且AnimeGAN项目组业已在线上发布demo接口,可以直接运行模型效果:https://huggingface.co/spaces/akhaliq/AnimeGANv2 但是受限于带宽以及线上资源瓶颈,线上迁移队列经常会处于排队的状态,同时一些原图的上传也可能造成个人隐私的外泄。所以本次我们在M1芯片的Mac os Monterey基于Pytorch深度学习框架,本地搭建AnimeGANV2版本的静态图片以及动态视频的转化服务。我们知道,目前Pytorch的cpu版本在M1芯片mac上的支持版本是Python3.8,在之前的一篇文章中:金玉良缘易配而木石前盟难得|M1 Mac os(Apple Silicon)天生一对Python3开发环境搭建(集成深度学习框架Tensorflow/Pytorch),曾经使用condaforge来构建Pytorch的开发环境,这次我们使用原生的安装包进行安装,首先进入Python官网,下载 Python3.8.10 universal2 稳定版 :https://www.python.org/downloads/release/python-3810/双击安装即可,随后进入终端键入命令安装Pytorch:pip3.8 install torch torchvision torchaudio这里我们默认安装最新的稳定版1.10,随后进入Python3.8命令行,导入torch库:(base) ➜ video git:(main) ✗ python3.8 Python 3.8.10 (v3.8.10:3d8993a744, May 3 2021, 09:09:08) [Clang 12.0.5 (clang-1205.0.22.9)] on darwin Type "help", "copyright", "credits" or "license" for more information. 
>>> import torch >>>确定Pytorch可以使用之后,将官方项目克隆下来:git clone https://github.com/bryandlee/animegan2-pytorch.gitAnimeGAN也是基于生成对抗网络(Generative adversarial network),原理就是我们手上有一定量的原图,我们可以称之为三次元图片,真实的图片特征会存在一个分布,比如:正态分布,均匀分布,或者更为复杂的分布形式,那么GAN的目的是通过生成器来生成一批与真实分布接近的数据。这些数据可以理解为二次元的优化,但是会保留三次元的一些特征,比如说眼睛变大、脸型更接近滤镜模型的画风等等,在我们的处理中,这个生成器趋向于使用神经网络,因为它能表示更为复杂的数据分布情况。下载成功之后,可以在weights文件夹下看到四种不同的权重模型,其中celeba\_distill.pt和paprika.pt是用来转化风景图片的,而face\_paint\_512\_v1.pt和face\_paint\_512\_v2.pt则更注重于肖像的转化。首先安装图像处理库Pillow:pip3.8 install Pillow随后新建test\_img.py文件:`from PIL import Image import torch import ssl ssl._create_default_https_context = ssl._create_unverified_context model = torch.hub.load("bryandlee/animegan2-pytorch:main", "generator", pretrained="celeba_distill") #model = torch.hub.load("bryandlee/animegan2-pytorch:main", "generator", pretrained="face_paint_512_v1") #model = torch.hub.load("bryandlee/animegan2-pytorch:main", "generator", pretrained="face_paint_512_v2") #model = torch.hub.load("bryandlee/animegan2-pytorch:main", "generator", pretrained="paprika") face2paint = torch.hub.load("bryandlee/animegan2-pytorch:main", "face2paint", size=512) img = Image.open("Arc.jpg").convert("RGB")``out = face2paint(model, img) out.show()`这里以凯旋门的照片为例子,分别使用celeba\_distill和paprika滤镜查看效果,注意本地请求需要关闭ssl证书检测,同时首次运行需要下载线上模型参数:这里图像尺寸参数指的是宽高通道的总数,接下来就是人物肖像动漫风格转化了,调整导入的模型生成器类型,输入图片改成人物肖像:from PIL import Image import torch import ssl ssl._create_default_https_context = ssl._create_unverified_context import numpy as np #model = torch.hub.load("bryandlee/animegan2-pytorch:main", "generator", pretrained="celeba_distill") #model = torch.hub.load("bryandlee/animegan2-pytorch:main", "generator", pretrained="face_paint_512_v1") model = torch.hub.load("bryandlee/animegan2-pytorch:main", "generator", pretrained="face_paint_512_v2") #model = torch.hub.load("bryandlee/animegan2-pytorch:main", "generator", pretrained="paprika") face2paint = torch.hub.load("bryandlee/animegan2-pytorch:main", "face2paint", size=512) 
img = Image.open("11.png").convert("RGB") out = face2paint(model, img) out.show()可以看到,v1滤镜相对风格化更强烈一些,而v2在风格化的基础上相对保留了原图的特征,源于三次元又不拘泥于体验,架空却又不流于虚浮,比抖音的漫画滤镜不知道高到哪里去了。下面我们来看看动态视频的动漫滤镜转换,视频从广义上来讲,就是多张图片的连拍播放,只不过取决于视频帧的速率问题,帧速率也称为FPS(Frames PerSecond)的缩写——帧/秒,是指每秒钟刷新的图片的帧数,也可以理解为图形处理器每秒钟能够刷新几次。 越高的帧速率可以得到更流畅、更逼真的动画,每秒钟帧数(FPS)越多,所显示的动作就会越流畅。这里可以通过第三方软件将连贯的视频转换为以FPS为单位的图片,在m1 mac os系统中,推荐使用著名的视频处理软件:Ffmpeg使用arm架构的Homebrew进行安装:brew install ffmpeg安装成功后,在终端键入ffmpeg命令查看版本:(base) ➜ animegan2-pytorch git:(main) ✗ ffmpeg built with Apple clang version 13.0.0 (clang-1300.0.29.3) configuration: --prefix=/opt/homebrew/Cellar/ffmpeg/4.4.1_3 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libdav1d --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-avresample --enable-videotoolbox安装没有问题,随后准备一个视频文件,新建 video\_img.py:import os # 视频转图片 os.system("ffmpeg -i ./视频.mp4 -r 15 -s 1280,720 -ss 00:00:20 -to 00:00:22 ./myvideo/%03d.png")这里我们使用Python3内置的os模块直接运行ffmpeg命令,针对当前目录的视频,以每秒15帧的速率进行转化,-s参数代表视频解析度,-ss参数可以控制视频的开始位置和结束位置,最后是导出图片的目录。运行脚本之后,进入myvideo目录:(base) ➜ animegan2-pytorch git:(main) ✗ cd myvideo (base) ➜ myvideo git:(main) ✗ ls 001.png 004.png 007.png 010.png 013.png 016.png 019.png 022.png 025.png 028.png 002.png 005.png 008.png 011.png 014.png 017.png 020.png 023.png 026.png 029.png 003.png 006.png 009.png 
012.png 015.png 018.png 021.png 024.png 027.png 030.png (base) ➜ myvideo git:(main) ✗可以看到,图片按照帧数作为下标文件名已经转换完毕。接着需要利用AnimeGAN滤镜对图片进行批量转换:from PIL import Image import torch import ssl ssl._create_default_https_context = ssl._create_unverified_context import numpy as np import os img_list = os.listdir("./myvideo/") # model = torch.hub.load("bryandlee/animegan2-pytorch:main", "generator", pretrained="celeba_distill") # model = torch.hub.load("bryandlee/animegan2-pytorch:main", "generator", pretrained="face_paint_512_v1") model = torch.hub.load("bryandlee/animegan2-pytorch:main", "generator", pretrained="face_paint_512_v2") # #model = torch.hub.load("bryandlee/animegan2-pytorch:main", "generator", pretrained="paprika") face2paint = torch.hub.load("bryandlee/animegan2-pytorch:main", "face2paint", size=512) for x in img_list: if os.path.splitext(x)[-1] == ".png": print(x) img = Image.open("./myvideo/"+x).convert("RGB") out = face2paint(model, img) out.show() out.save("./myimg/"+x) # exit(-1)每一次转换都将原图保留并且滤镜转化后的图片存放在相对目录myimg里面,随后新建img\_video.py将其重新转换为视频:import os # 图片转视频 os.system("ffmpeg -y -r 15 -i ./myimg/%03d.png -vcodec libx264 ./myvideo/test.mp4")依然是每秒15帧的速率,和原视频相同。如果原视频带有音轨,可以先将音轨进行分离操作:# 抽离音频 import os os.system("ffmpeg -y -i ./lisa.mp4 -ss 00:00:20 -to 00:00:22 -vn -y -acodec copy ./myvideo/3.aac")进行动漫滤镜转换之后,将转换后的视频和原视频的音轨进行合并操作:# 合并音视频 os.system("ffmpeg -y -i ./myvideo/test.mp4 -i ./myvideo/3.aac -vcodec copy -acodec copy ./myvideo/output.mp4")原视频的测试用例:转换后效果:在m1芯片的加持下,基于cpu版本的Pytorch跑起来效率还是不错的,不过令人遗憾的是适配m1芯片的gpu版本的Pytorch我们还需要等待一段时间,在上个月,Pytorch项目组成员soumith给出过这样的回应:So, here's an update. We plan to get the M1 GPU supported. @albanD, @ezyang and a few core-devs have been looking into it. I can't confirm/deny the involvement of any other folks right now. So, what we have so far is that we had a prototype that was just about okay. 
We took the wrong approach (more graph-matching-ish), and the user-experience wasn't great -- some operations were really fast, some were really slow, there wasn't a smooth experience overall. One had to guess-work which of their workflows would be fast. So, we're completely re-writing it using a new approach, which I think is a lot closer to your good ole PyTorch, but it is going to take some time. I don't think we're going to hit a public alpha in the next ~4 months. We will open up development of this backend as soon as we can.可以看出来,项目组应该是彻底为m1芯片重构Pytorch底层,公开测试版也不会在近期推出,也许明年的下半年会放出来,还是非常值得期待的。结语:无论是清华大学的CartoonGAN,还是基于CartoonGAN的AnimeGANv2,毫无疑问,它们都是业界的翘楚,是顶峰中的顶峰,就算是放在世界人工智能的范围上,摆在PyTorch-GAN这样的项目旁边,也是毫不逊色的,在人工智能领域,AnimeGANv2向世界宣布,中国人只能制造药丸补剂的历史已经一去不复返了。
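正文中的抽帧命令是通过 os.system 拼接字符串执行的,一个更稳妥的做法是用参数列表组装命令(路径带空格或中文时不必担心转义);另外注意 ffmpeg 的 -s 参数格式应为"宽x高"而非逗号分隔。下面是一个只负责组装命令、不实际调用 ffmpeg 的小示意:

```python
import shlex

def extract_frames_cmd(video, out_dir, fps=15, size="1280x720",
                       start="00:00:20", end="00:00:22"):
    """组装与正文等价的 ffmpeg 抽帧命令:
    -r 指定帧率,-s 指定分辨率(宽x高),-ss/-to 指定起止时间。"""
    return ["ffmpeg", "-i", video, "-r", str(fps), "-s", size,
            "-ss", start, "-to", end, f"{out_dir}/%03d.png"]

cmd = extract_frames_cmd("视频.mp4", "./myvideo")
print(shlex.join(cmd))
```

实际执行时交给 subprocess.run(cmd, check=True) 即可,出错时会直接抛出异常而不是静默失败。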

一网成擒全端涵盖,在不同架构(Intel x86/Apple m1 silicon)不同开发平台(Win10/Win11/Mac/Ubuntu)上安装配置Python3.10开发环境

时光荏苒,过隙白驹,进入2023年,著名敏捷开发语言Python也放出了3.10最终版,本次我们来展示一下在不同的系统和平台中,高效部署Python3.10开发环境,这里我们并不依赖其他的三方软件,只通过Python官方的安装包进行配置,编辑器我们依然使用微软开源的Vscode,争取在一分钟以内就可以在一台干净的开发机上部署好环境,省去一些不必要的步骤。首先我们以市场占有率最高的微软Intel芯片x86架构的64位win10系统为例子。第一步,打开python官网,python.org 选择 python3.10.0 64位 windows系统安装包。下载成功后,双击安装,这里不要选择默认第一个立刻安装,因为默认安装会把python安装到一个非常深的目录,在日常开发中我们有可能去修改一些库的源码,这种目录非常深的情况会造成一些不便。 同时勾选添加环境变量。随后是一些默认安装的插件,不需要单独设置,点击下一步下面这里建议将python安装到c盘根目录,方便我们随时修改和调试安装成功后,可以按快捷键:视窗键+r输入cmd 打开终端 输入python命令,如果可以进入python命令行说明安装成功Python 3.10.0 (tags/v3.10.0:ccb0e6a, Nov 15 2021, 18:08:50) [MSC v.1929 64 bit (AMD64)] on win32 Type "help", "copyright", "credits" or "license" for more information. >>>否则就有可能是环境没有配好,这里我们可以检查一下右键点击此电脑,选择属性,高级系统设置,环境变量,可以看到已经配置好了,这里也可以配置不同版本的Python。除了Python本体,我们还可以配置一下pip的安装源,pip是我们用来装三方库的软件,默认源是国外的网址,速度很慢。这里我们打开用户目录,选择当前用户目录,新建一个文件夹叫做pip,随后新建pip.ini的配置文件:[global] trusted-host = mirrors.aliyun.com index-url = http://mirrors.aliyun.com/pypi/simple文件内部指定信任的域名,然后把具体的源地址写入就可以了。重新打开终端 输入 命令 pip config list 如果看到原地址发生了变化,那么说明配置成功liuyue:Versions liuyue$ pip config list global.index-url='http://mirrors.aliyun.com/pypi/simple/' install.trusted-host='mirrors.aliyun.com' liuyue:Versions liuyue$下面来看看编辑器vscode的配置,打开vscode官网,点击download这里我们选择因特尔芯片64位的系统安装包点击下载安装成功后,双击打开vscode,这里需要安装两个插件。 点击打开插件商店先安装Python插件,它可以帮助我们选择python解释器,并且附带语法检查和代码补全随后可以选择中文语言包,让vscode界面变成中文安装好以后,需要重启编辑器。这里因特儿芯片64位win10系统的开发环境就配置好了。其实win10系统还有另外一个版本,那就是微软适配苹果m1芯片arm架构的win10系统,这个系统也是64位的,整体配置流程上和因特尔芯片的win10没有太大的区别。同样下载 windows 64位安装包,进行上面的安装步骤。唯一需要注意的是,在下载vscode的时候,要选择arm架构的系统安装包,安装步骤没有区别同样可以正常运行python10代码。下面我们来到 windows 11 系统,该系统由于正式版出来的时间并不长,所以微软官方也不建议,将该系统作为生产力工具使用,但是在我的测试过程中,python 3.10 的开发环境也可以正常配置,开发过程中并没有发现什么bug。具体配置流程: 还是下载 windons 64位安装包,进行之前的安装步骤 正常配置pip源 下载vscode的时候,根据系统芯片的区别对应选择vscode版本即可,安装和配置插件和win10系统并无二致。 最后也可以正常运行python代码,所以win 11 平台的向下兼容做的还是非常不错的。接着我们来到mac系统,mac系统也分两大类型,首先我们来看因特尔芯片的mac。流程还是官网下载安装,这里如果是 3.10版本可以直接下载64位mac安装包,如果是老版本的话,比如3.9,建议下载因特儿芯片的专用的安装包。 
下载成功后,点击安装,不需要特殊配置,系统会自动把python安装到应用程序中。默认安装目录:/Library/Frameworks/Python.framework/Versions/3.10/我们点开终端,直接输入python3命令即可,同时pip也会自动配置好,每个版本号都有自己单独的命令,切换起来也非常方便。liuyue:Versions liuyue$ python3 Python 3.10.0 (v3.9.9:ccb0e6a345, Nov 15 2021, 13:29:20) [Clang 6.0 (clang-600.0.57)] on darwin Type "help", "copyright", "credits" or "license" for more information. >>>同样的,mac系统也可以更改pip安装源,在用户目录下,创建.pip文件夹,写入pip.conf 文件。vim ~/.pip/pip.conf文件内容和win10平台一致,也是指定阿里云国内源。[global] index-url = http://mirrors.aliyun.com/pypi/simple/ [install] trusted-host = mirrors.aliyun.com随后输入 pip3 config list 查看是否生效。liuyue:Versions liuyue$ pip3 config list global.index-url='http://mirrors.aliyun.com/pypi/simple/' install.trusted-host='mirrors.aliyun.com' liuyue:Versions liuyue$当然了,如果通过安装包安装了不同版本的Python,此时需要指定一个默认版本,也可以通过软链接的形式进行配置:sudo ln -s /Library/Frameworks/Python.framework/Versions/3.10/bin/python3.10 /usr/local/bin/python3同理,不同版本的pip也可以设置软链接:sudo ln -s /Library/Frameworks/Python.framework/Versions/3.10/bin/pip3.10 /usr/local/bin/pipmac系统下vscode配置,如果是因特尔芯片系统,那么可以选择因特尔专用的版本,下载成功后是一个压缩包,解压缩后,把文件直接拖动到应用程序里即可。插件方面,和win10系统一样,只需要安装Python插件就可以正常使用了。接着我们来看一下 苹果m1芯片 arm架构的mac 系统。这里我们以最新的苹果 monterey系统为例子:在安装文件的版本选择上,无论是3.10最新版,还是老版本,一律选择arm架构专用的安装包 随后双击安装,安装流程上没有任何区别,同样可以配置pip源。 编辑器层面,也是选择arm架构的版本进行下载。 可以说m1 芯片mac系统在配置上除了安装文件一律选择arm架构,其他流程和intel 芯片的mac系统并无二致。最后来看看ubuntu系统,这里我们以百度云的ubuntu 20.04的版本为例子。ubuntu也是第一个支持通过软件管理器直接安装python3.10的unix内核系统,其他系统比如说centos还需要进行编译安装,这里我们用apt-get来安装python。登录系统后,首先将安装源添加到apt-get,添加 deadsnakes PPA 到源列表。add-apt-repository ppa:deadsnakes/ppa升级apt-get。apt update随后安装 python 3.10apt install python3.10安装成功后,就可以直接进入python命令行了。root@instance-fxsra23d:~# python3.10 Python 3.10.0 (default, Oct 4 2021, 22:09:55) [GCC 9.3.0] on linux Type "help", "copyright", "credits" or "license" for more information. 
>>>但是这里3.10并不是唯一版本,我们可以看到系统默认的版本是3.8。root@instance-fxsra23d:~# python3 Python 3.8.10 (default, Sep 28 2021, 16:10:42) [GCC 9.3.0] on linux Type "help", "copyright", "credits" or "license" for more information. >>>随后可以使用命令将3.10设置为第一顺位的默认版本。sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.10 1之后默认版本就已经切换为python 3.10root@instance-fxsra23d:~# python3 Python 3.10.0 (default, Oct 4 2021, 22:09:55) [GCC 9.3.0] on linux Type "help", "copyright", "credits" or "license" for more information. >>>结语:藉此,我们分别在两个不同芯片架构上的五个不一样的操作系统展示了如何配置python3.10开发环境,诚然,python 3.10 新版本固然不错,但是现有项目能否在不作大面积修改的情况下仍然可以正常运行需要打一个问号,是的,版本迭代的理想性和语言升级实践的现实性之间,总是存在相当的差距,从而使得升级本身造成很大的阻力,但很多时候,为了长期利益,短期的阵痛则是必须的。
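各平台 pip 配置文件的内容其实完全一样,只是存放路径不同(Windows 为 %USERPROFILE%\pip\pip.ini,mac/Linux 为 ~/.pip/pip.conf)。下面用标准库 configparser 写一个生成该配置内容的小脚本示意,镜像地址沿用正文中的阿里云源:

```python
import configparser
import io

def render_pip_conf(index_url="http://mirrors.aliyun.com/pypi/simple/",
                    host="mirrors.aliyun.com"):
    # 生成与正文一致的 pip 配置文本:global.index-url + install.trusted-host
    cfg = configparser.ConfigParser()
    cfg["global"] = {"index-url": index_url}
    cfg["install"] = {"trusted-host": host}
    buf = io.StringIO()
    cfg.write(buf)
    return buf.getvalue()

print(render_pip_conf())
```

把输出写入对应平台的配置文件路径,再用 pip config list 验证即可。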

基于NOSTR协议的“公有制”版本的Twitter,去中心化社交软件Damus用后感,一个极端走向另一个极端

最近,一个幽灵,Web3的幽灵,在网络游荡,它叫Damus,这玩意诠释了什么叫做病毒式营销,滑稽的是,一个Web3产品却在Web2的产品链上疯狂传销,各方大佬纷纷为其背书,到底发生了什么?Damus的葫芦里,卖的是什么药?注册和简单实用很少有什么产品在用户注册环节会有什么噱头,但Damus确实出其不意,它抛开了传统的Web3产品“区块链钱包先行”的策略,直接一键式生成秘钥对,没有了任何门槛,即使是对Web3完全没有任何概念的普通人,也可以直接上手使用,这里我们使用Damus的网页版,直接访问 https://snort.social/login: 点击页面中的Generate Key按钮即可注册。 注册成功后,进入settings,选择profile,账户设置页面。 在这里我们可以像传统的web2.0社交产品一样,填写昵称,上传头像或者banner,以及其他的一些个人资料,总体上乏善可陈,设置完毕后是这样的: 值得一提的是,这里用户的唯一标识是一串公钥(Public Key)地址:npub16mu2qn54ehx3eh04jy5naq72xkhx3wz6shmkmlr35cpjccgyy5ksvm0plu Damus的用户可以根据公钥地址来选择关注其他用户,也可以进行“发帖”操作,发布的“帖子”会被关注者们看到,玩惯了Twitter的用户对这些都不陌生。 在个人设置页面中:https://snort.social/settings/profile ,有一个很关键的私钥(Private Key),这个东西是唯一能够证明“你是你自己”的凭证,有点像区块链钱包Metamask中的助记词,登录Damus的时候,可以选择使用私钥进行登录。NOSTR协议Damus底层基于NOSTR协议,那么什么是NOSTR协议?其实和我们熟识的HTTP协议也差不了太多,也分为两个端,只不过HTTP协议是客户端和服务端,而NOSTR协议则没有服务端,取而代之的是中继端(relay)。 说白了,没有了中心化的服务器端,变成点对点的中继器,这个中继器可以理解为“共产化”的服务器,每个人都可以搭建并且传输数据,如此就形成了一个完全去中心化的社交网络。 好处就是用户不再受中心化服务器的制约,只要中继器存在,就可以发布想要发布的所有信息。 在后台我们也可以自由的设置和添加NOSTR协议的中继器,甚至可以修改读写权限: 数据交换形式则采用websocket + JSON 的方式: 具体的交互数据包括当前用户的档案信息,比如公钥地址、用户头像,用户简介等等,用户发送的信息内容,也就是帖子内容,最后,是用户推送给关注者的中继器地址,例如上文中的wss://relay.snort.social。一个极端走向另一个极端NOSTR协议赋予了Damus网络用户极致的“自由”,可是“自由”也是需要付出代价的,那就是负面有害信息的肆意传播和增长,由于任何人都可以运行一个或多个中继器,所以,就很难有人能控制所有的中继器,也就没法针对某些散发有害信息的公钥地址进行限制,这就意味着,没有了任何所谓的“规则”,变成了彻头彻尾的“黑暗森林法则”。不得不承认,去中心化带来的并非都是美好的事物。它同样刺激了信息操纵和误导我们的判断,从而给去中心化网络带来了诸多问题。这些问题就像顽疾一样,让人们痛苦却无可奈何。我们往往并不清楚自己真正需要的是什么,而这个弱点常常会被利益集团抓住,并加以充分利用。这就是他们的欺骗行为。欺骗行为的不可避免性未必源于那些品质恶劣的人,而是很可能源于去中心化的自然运作。 举个例子,20世纪40、50年代,逐步有医学证据表明吸烟和肺癌之间的关系。但是由烟草公司资助的研究,指出吸烟与癌症之间的关系还没有被证实。哥伦比亚广播公司的电视节目里安排的论战,显得“抽烟导致癌症”和“抽烟不会导致癌症”的证据难分胜负。但美国1964年卫生总署发布报告,明确指出,抽烟有害健康。该报告代表了美国政府的官方立场。1973年,在公共场所吸烟被禁止。为了应对烟草行业联盟“抽烟很潇洒”的诱惑,反烟草运动持续传播“抽烟很愚蠢”这一信念。1964年发布的卫生总署报告在这方面居功至伟,这,就是监管的力量。结语成也去中心化,败也去中心化,在去中心化网络中,拿着钓竿坐等鱼上钩的“姜太公”无处不在,根据简单的概率原理,就算我们谨小慎微、如履薄冰,最终,迟早都会被人“钓”到,没有人能够幸免。
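上文提到 NOSTR 采用 websocket + JSON 交换数据,按照 NIP-01 规范,一条"帖子"就是一个事件(event)对象,其 id 是对内容做规范化序列化后的 SHA-256 摘要。下面用标准库构造一个最小事件示意(公钥为占位值,且省略了签名环节):

```python
import hashlib
import json

def nostr_event_id(pubkey, created_at, kind, tags, content):
    # NIP-01:对 [0, 公钥, 时间戳, 类型, 标签, 内容] 做紧凑 JSON 序列化后取 SHA-256
    payload = json.dumps([0, pubkey, created_at, kind, tags, content],
                         separators=(",", ":"), ensure_ascii=False)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

event = {
    "pubkey": "ab" * 32,      # 占位:64 位十六进制公钥
    "created_at": 1675000000,
    "kind": 1,                # kind=1 表示文本帖子
    "tags": [],
    "content": "hello nostr",
}
event["id"] = nostr_event_id(event["pubkey"], event["created_at"],
                             event["kind"], event["tags"], event["content"])
print(event["id"])
```

任何一个中继器收到事件后都可以独立重算这个摘要并校验签名,这正是"不需要中心化服务器"的技术基础。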

Effortless as flowing water: a pure-CSS3 light-sweep hover effect that gives image logos a remarkable sheen

Movie lovers will have noticed an amusing detail: film studios always add a small effect to the logo in the opening titles — amid moving shadows, a gloss sweeps past in a flash. This not only raises the logo's recognizability but also adds a sense of quality: two birds with one stone. Take Warner Bros. Pictures as the reference example. So, in the front-end world, can a similar effect be achieved with pure CSS? The answer is yes, of course. Using this site's logo as the example, let's walk through how to implement a light-sweep hover effect on an image logo with pure CSS.

Most front-end developers would reach for linear-gradient(), a function that creates an image consisting of a linear gradient between two or more colors. Its result is of the <gradient> data type, a special kind of <image>. Basic usage:

```css
/* Gradient axis at 45deg, from blue to red */
linear-gradient(45deg, blue, red);

/* From bottom-right to top-left, from blue to red */
linear-gradient(to left top, blue, red);

/* Bottom to top: starts blue, green begins at 40% of the height, ends red */
linear-gradient(0deg, blue, green 40%, red);
```

How does this combine with the logo image? First create an element; since it is a logo, I use an <a> tag — a hyperlink — and give it the class mylogo:

```html
<a href="/" class="mylogo" title="刘悦的技术博客"></a>
```

Then define the logo's style:

```css
.mylogo {
    display: block;
    margin: 0 auto;
    width: 255px;
    height: 200px;
    background-image: url(/logo.png);
    background-repeat: no-repeat;
}
```

Now linear-gradient() takes the stage. The principle is not complicated: draw a semi-transparent white gradient layer with linear-gradient, hide it off-canvas with a negative background position, and, with the transition property, shift the highlight's position in a 1-second animation on hover, producing the look of a gloss sweeping across:

```css
.mylogo {
    width: 255px;
    height: 200px;
    background: -webkit-linear-gradient(left, rgba(255,255,255,0) 0, rgba(255,255,255,0.5) 50%, rgba(255,255,255,0) 100%) no-repeat -270px 0,
                url(/logo.png) no-repeat;
    transition: 1s ease;
}
.mylogo:hover {
    background-position: 200px 0, 0 0;
}
```

Note that the initial negative offset must exceed the logo's own width, otherwise the travel distance is insufficient. The effect looks decent. Here, however, transition is set on the logo element itself, so when the logo loses mouse focus the highlight slides back to its negative starting position and the gloss flashes a second time — one hover, two movements, two flashes. To flash only once, move transition into the :hover rule; then the animation only plays while hovering, and there is no return flash after the mouse leaves:

```css
.mylogo {
    width: 255px;
    height: 200px;
    /* Each gradient stop needs enough room, or the highlight renders clipped */
    /* no-repeat -270px 0 parks the highlight out of view */
    background: -webkit-linear-gradient(left, rgba(255,255,255,0) 0, rgba(255,255,255,0.5) 50%, rgba(255,255,255,0) 100%) no-repeat -270px 0,
                url(/logo.png) no-repeat;
    /* transition: 1s ease; */
}
.mylogo:hover {
    /* Slide the highlight on hover; with multiple backgrounds, background-position
       needs one value per layer, or the other layers are affected too */
    background-position: 200px 0, 0 0;
    transition: 1s ease;
}
```

But is that the end? Not quite, because this looks a little... formulaic. If everyone uses linear-gradient, it gets dull. Is there a less conventional approach? Since we know the principle — it is just a displacement trick — we can drop linear-gradient entirely and use a background image with a glossy texture, shine.png. Because a background image is used, the markup needs a change: add a container, a span tag, for it:

```html
<a href="/" class="mylogo" title="刘悦的技术博客"><span></span></a>
```

The style is much like the linear-gradient version: hide the span's background image with negative offsets:

```css
.mylogo span {
    display: block;
    background: url("/shine.png") -360px -380px no-repeat;
    transition-property: all;
    transition-duration: .7s;
    height: 200px;
    width: 255px;
}
```

The next step is even simpler than with linear-gradient: set the hover rule and shift the background image:

```css
.mylogo:hover span {
    background-position: 100px 300px;
}
```

Looking closely, the background image fits the light-sweep effect better, because linear-gradient's stops are not rendered uniformly across screens of different resolutions — at high resolutions the highlight can appear clipped. In dark mode it looks even more refined. Beyond that, transition invites wilder experiments:

```css
.mylogo:hover {
    -webkit-transform: rotate(666turn);
    transform: rotate(666turn);
    transition-delay: 1s;
    transition-property: all;
    transition-duration: 59s;
    transition-timing-function: cubic-bezier(.34, 0, .84, 1);
}
```

Let's spin, jump, eyes closed.

Conclusion: both approaches implement the light-sweep effect well. The difference is that linear-gradient costs no site bandwidth but does cost the client's CPU and memory, while the background image looks better than the gradient but uses more network bandwidth — and WebP can compress images drastically (see: https://v3u.cn/a_id_190). So we can see this as a trade-off: books teach principles, but reality runs on trade-offs, doesn't it?
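Numerically, all the transition does is interpolate background-position from the hidden start offset to the hover target over the duration. A rough Python sketch of that interpolation (linear easing is assumed here for simplicity; real CSS applies an easing curve such as ease on top):

```python
def sweep_position(t: float, start: float = -270.0, end: float = 200.0,
                   duration: float = 1.0) -> float:
    # x offset of the highlight (in px) at time t seconds into the transition;
    # progress is clamped to [0, 1] just as a CSS transition clamps its timeline
    progress = min(max(t / duration, 0.0), 1.0)
    return start + (end - start) * progress


# Sample the sweep at a few instants of the 1s hover animation
for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(t, sweep_position(t))
```

This also shows why the start offset must exceed the logo width: at t=0 the highlight must sit entirely outside the 255px-wide visible area.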

Building an Appium container with Docker on Win10 and connecting the Genymotion Android emulator for mobile Python automated testing

Python automation is perhaps one of this year's hottest topics. Even in midsummer, the job market's enthusiasm for Python automation has not cooled. So what can Python automation actually do for us?

First, it spares skilled workers repetitive work. For a mature, feature-complete product, every new release — major or minor — keeps most features and UI nearly identical to the previous version. Backward compatibility means you cannot skip regression-testing the old features just because new ones were added, yet those old features were already tested when the last version shipped. Such features are ideal candidates for automated testing, which lets testing genuinely cover every feature.

Second, it raises testing efficiency. Suppose a project's development cycle is only a few months, a new build ships to testers every week, and the system has thousands or tens of thousands of feature points: manual testing is extremely time-consuming and tedious, and efficiency inevitably suffers. An automated pipeline fixes exactly that.

For mobile app testing, every new release confronts QA with compatibility testing across thousands of device models on the market — a truly massive undertaking, not to mention that some engineering devices simply cannot be purchased locally, such as Google's Nexus and Pixel phones. So this time, on Win10, we will pair the Genymotion emulator with an Appium container built on Docker to automate testing across a large number of models in a short time.

Why choose Genymotion as the Android emulator? There are certainly other emulators available domestically, but Genymotion is the leader in this field: it starts and runs very fast, supports multiple Android versions, and can run several Android systems simultaneously, which makes running test scripts in parallel convenient. Its drawback is that the desktop client is x86-only and does not yet support arm-based apps, although the cloud service already supports arm — a major plus in the emulator space.

Go to the Genymotion sign-up page: https://www-v1.genymotion.com/account/create/ — after registering, remember to activate the account via the confirmation e-mail — then the download page: https://www.genymotion.com/download/. There are two Windows builds here: Genymotion's core is based on the VirtualBox hypervisor, so choose "with VirtualBox" if VirtualBox is not installed; otherwise choose "without VirtualBox" for just Genymotion itself.

After installing, log in with the newly registered account, choose personal use, and create a virtual device for the model to be tested. The VM still needs some configuration to run properly: in VirtualBox's General settings, set the version to Other Linux (64-bit), and in the Network settings set promiscuous mode to "Allow VMs" — this will matter when connecting to the VM later. Finally, to be safe, disable the host's Hyper-V feature: open a terminal with administrator privileges and run:

```
bcdedit /set hypervisorlaunchtype off
```

After rebooting, start the phone emulator; if the Android UI appears, the configuration succeeded.

Next, let's configure Android ADB. What is ADB? It stands for Android Debug Bridge, a tool commonly used when developing for or working with Android: from a PC, over a USB connection, it controls the phone through command-line instructions. Here we mainly use ADB commands to obtain the VM's IP. Download the package directly: https://dl.google.com/android/repository/platform-tools-latest-windows.zip, extract it to the C: drive root — C:\platform-tools_r31.0.2-windows\platform-tools — then add that directory to the global environment variables so it is reachable from any terminal:

```
C:\Users\liuyue>adb --version
Android Debug Bridge version 1.0.41
Version 31.0.2-7242960
Installed as C:\platform-tools_r31.0.2-windows\platform-tools\adb.exe
```

Now run the device-list command:

```
C:\Users\liuyue>adb devices
List of devices attached
192.168.42.103:5555     device
```

The VM we just started already appears in the device list; connect to it with the connect command — virtually indistinguishable from a real phone:

```
C:\Users\liuyue>adb connect 192.168.42.103:5555
already connected to 192.168.42.103:5555
```

Now it is Docker's turn. Docker's job here is to run the Appium automation scripts inside a container, sparing us Appium's fiddly installation and configuration. Of course, Docker itself must already be installed; if not, see: win10系统下把玩折腾DockerToolBox以及更换国内镜像源(各种神坑). Then pull an Appium base image — images on Docker Hub are a mixed bag, so the official build is recommended: https://hub.docker.com/r/appium/appium

```
docker pull appium/appium
```

Check the image:

```
$ docker images
REPOSITORY      TAG     IMAGE ID      CREATED       SIZE
appium/appium   latest  70f3d328b949  6 weeks ago   1.55GB
```

Then start the container, mapping port 4723; the --privileged flag grants it root privileges, and -d runs it in the background:

```
docker run --privileged -d -p 4723:4723 --name appium appium/appium
```

Check the container's status:

```
$ docker ps
CONTAINER ID  IMAGE          COMMAND                 CREATED        STATUS         PORTS                             NAMES
a2e8f11fdf7c  appium/appium  "/bin/sh -c '/root/w…"  7 minutes ago  Up 19 seconds  4567/tcp, 0.0.0.0:4723->4723/tcp  appium
```

We can now drive the adb service inside the container:

```
$ docker exec -it appium adb connect 192.168.42.103:5555
connected to 192.168.42.103:5555
```

All the usual adb commands work, for example checking the phone's Android version:

```
$ docker exec -it appium adb shell getprop ro.build.version.release
5.0
```

Now let's write a simple set of Appium automation test scripts. First install the Appium client library:

```
pip install Appium-Python-Client
```

Write appium_test.py:

```python
from appium import webdriver

cap = {
    "platformName": "Android",
    "platformVersion": "5",
    "deviceName": "192.168.42.103:5555",
    "udid": "192.168.42.103:5555",
    # For a real device:
    # "platformName": "Android",
    # "platformVersion": "7.1.2",
    # "deviceName": "10d4e4387d74",
    "noReset": True,
    "unicodeKeyboard": True,
    "resetKeyboard": True,
}

driver = webdriver.Remote('http://192.168.99.100:4723/wd/hub', cap)

# Install an APK
driver.install_app(app_path='C:\\test.apk',
                   replace=False,           # do not overwrite an existing install
                   timeout=10000,           # 10-second timeout
                   allowTestPackages=True,  # allow test packages
                   useSdcard=False,         # do not install to the SD card
                   grantPermissions=False)  # do not auto-grant permissions
driver.quit()
```

Here 192.168.42.103:5555 is the address of the Genymotion phone client, and http://192.168.99.100:4723/wd/hub is the Docker-hosted Appium container; the script installs a test app on the phone. After the install script has run, we can check whether the install succeeded:

```python
from appium import webdriver

cap = {
    "platformName": "Android",
    "platformVersion": "5",
    "deviceName": "192.168.42.103:5555",
    "udid": "192.168.42.103:5555",
    "noReset": True,
    "unicodeKeyboard": True,
    "resetKeyboard": True,
}

driver = webdriver.Remote('http://192.168.99.100:4723/wd/hub', cap)
# Is the app installed? The argument is the package name
res = driver.is_app_installed('com.tencent.android.qqdownloader')
print(res)
driver.quit()
```

Scripts can also launch apps, such as the built-in calculator:

```python
from time import sleep

from appium import webdriver

cap = {
    "platformName": "Android",
    "platformVersion": "5",
    "deviceName": "192.168.42.103:5555",
    "udid": "192.168.42.103:5555",
    "appPackage": "com.android.calculator2",
    "appActivity": "com.android.calculator2.Calculator",
    "noReset": True,
    "unicodeKeyboard": True,
    "resetKeyboard": True,
}

driver = webdriver.Remote('http://192.168.99.100:4723/wd/hub', cap)
sleep(3)
driver.background_app(5)  # run in the background for 5 seconds, then return to the foreground
driver.close_app()        # close the app
sleep(3)
driver.launch_app()       # relaunch the app
sleep(3)
driver.quit()
```

The calculator's package name and activity are both configured in the cap variable. Killing an app's process:

```python
from time import sleep

from appium import webdriver

cap = {
    "platformName": "Android",
    "platformVersion": "5",
    "deviceName": "192.168.42.103:5555",
    "udid": "192.168.42.103:5555",
    "appPackage": "com.android.calculator2",
    "appActivity": "com.android.calculator2.Calculator",
    "noReset": True,
    "unicodeKeyboard": True,
    "resetKeyboard": True,
}

driver = webdriver.Remote('http://192.168.99.100:4723/wd/hub', cap)
sleep(3)
driver.activate_app('com.android.calculator2')   # activate the app if it is not running or is backgrounded
sleep(3)
driver.terminate_app('com.android.calculator2')  # terminate the app
sleep(3)
driver.quit()
```

Almost any operation on a mobile app can be scripted in Python for automated testing — think of Appium as the mobile counterpart of Selenium, and very convenient to use.

Conclusion: Shakespeare said there are a thousand Hamlets in a thousand viewers' eyes. Across thousands of Genymotion emulators, a mobile app can likewise take thousands of shapes, and Appium automation scripts can free testers from those thousands of repetitive test passes — why not?
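When many emulator instances run in parallel, scripts usually need the device list programmatically rather than by eye. A small parser for the `adb devices` output shown above (a sketch; it assumes the header-then-whitespace-separated format of the session output):

```python
def parse_adb_devices(output: str) -> list:
    # Parse `adb devices` output into (serial, state) pairs,
    # skipping the "List of devices attached" header and blank lines
    devices = []
    for line in output.splitlines()[1:]:
        parts = line.split()
        if len(parts) >= 2:
            devices.append((parts[0], parts[1]))
    return devices


sample = "List of devices attached\n192.168.42.103:5555\tdevice\n"
print(parse_adb_devices(sample))  # [('192.168.42.103:5555', 'device')]
```

Feeding the serials it returns into the `udid`/`deviceName` capabilities is one way to fan a test script out across every attached emulator.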

A sea of tests, half the effort: building Selenium Grid, an automated distributed web-testing cluster, with Docker containers

"Of all the gin joints in all the towns in all the world, she walks into mine..." — the famous line from the film Casablanca. Projected onto real life, analogous situations abound: there are so many operating systems in the world, and so many browsers on each system, that with only one test machine, must we really queue up and run compatibility tests one at a time? Is there a more efficient way?

Our answer: Docker + Selenium Grid. Selenium Grid is a distributed WebUI testing tool that dispatches test flows to multiple servers and executes them in parallel. Its architecture has two main roles: the Hub, the central control node, and the Nodes, Selenium worker nodes that register with the Hub and drive browsers to execute the automated test cases the Hub hands down. In other words: one scheduling center, with different operating systems installed on different machines, each carrying the browsers to be tested. Deploying a distributed Selenium Grid cluster the traditional way, however, is technically demanding — and a browser can only have one installed version and one running instance per operating system, so testing several versions of Chrome means installing each version on a separate physical or virtual machine, burning considerable time and hardware just to prepare test environments.

How do we cut through the complexity of setting up a Selenium Grid cluster? The answer is Docker. Yes, Docker — Docker yet again. With container technology, Docker can deploy multiple nodes directly on a single server; the process is simple and convenient, only a build script away, and it greatly improves testing efficiency. So this time we use Docker + Selenium Grid for concurrent compatibility testing across multiple systems and browser versions.

First, install Docker; see: win10系统下把玩折腾DockerToolBox以及更换国内镜像源(各种神坑). Then pull the image for the Selenium Grid scheduling center:

```
docker pull selenium/hub
```

Here we test compatibility on two browsers, Chrome and Firefox, so pull both node images:

```
docker pull selenium/node-chrome
docker pull selenium/node-firefox
```

Once all three images are downloaded, list the local images:

```
$ docker images
REPOSITORY              TAG     IMAGE ID      CREATED        SIZE
selenium/node-chrome    latest  0843e55de3dc  2 weeks ago    1.04GB
selenium/hub            latest  705be32777f0  2 weeks ago    283MB
selenium/node-firefox   latest  f794497d8393  2 months ago   956MB
```

With that verified, write the Docker Compose configuration. Docker Compose is the most basic container-orchestration tool; it can quickly coordinate several images working together. Write docker-compose.yml:

```yaml
version: "3"
services:
  hub:
    image: selenium/hub
    ports:
      - "4444:4444"
    environment:
      GRID_MAX_SESSION: 16
      GRID_BROWSER_TIMEOUT: 3000
      GRID_TIMEOUT: 3000
  chrome:
    image: selenium/node-chrome
    container_name: chrome
    depends_on:
      - hub
    environment:
      HUB_PORT_4444_TCP_ADDR: hub
      HUB_PORT_4444_TCP_PORT: 4444
      NODE_MAX_SESSION: 4
      NODE_MAX_INSTANCES: 4
    volumes:
      - /dev/shm:/dev/shm
    ports:
      - "9001:5900"
    links:
      - hub
  firefox:
    image: selenium/node-firefox
    container_name: firefox
    depends_on:
      - hub
    environment:
      HUB_PORT_4444_TCP_ADDR: hub
      HUB_PORT_4444_TCP_PORT: 4444
      NODE_MAX_SESSION: 2
      NODE_MAX_INSTANCES: 2
    volumes:
      - /dev/shm:/dev/shm
    ports:
      - "9002:5900"
    links:
      - hub
```

The gist of the configuration: the Selenium Grid hub service is deployed on port 4444 and made reachable from the host through the port mapping, using the selenium/hub image we just downloaded; the firefox and chrome browser services each depend on the hub service, and NODE_MAX_INSTANCES defines how many browser instances each node may run.

From the directory containing docker-compose.yml, start the services:

```
docker-compose -f docker-compose.yml up -d
```

The -d flag means run in the background; running in the foreground works too, of course. Then open http://localhost:4444/grid/console in a browser — the requested IP is the host's own, but the port mapping forwards into the Docker container to the Selenium Grid scheduling center. Both browser services have started normally, running four and two instances respectively; docker ps confirms the processes from the terminal:

```
$ docker ps
CONTAINER ID  IMAGE                  COMMAND                 CREATED     STATUS     PORTS                   NAMES
adcd4683f39c  selenium/node-firefox  "/opt/bin/entry_poin…"  2 days ago  Up 2 days  0.0.0.0:9002->5900/tcp  firefox
58dfe5825439  selenium/node-chrome   "/opt/bin/entry_poin…"  2 days ago  Up 2 days  0.0.0.0:9001->5900/tcp  chrome
97d602944b34  selenium/hub           "/opt/bin/entry_poin…"  2 days ago  Up 2 days  0.0.0.0:4444->4444/tcp  mytornado_hub_1
```

With the browsers ready, the rest is simple. Let's actually run a test against the Docker containers. Write test.py:

```python
import time

from selenium import webdriver
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities

# Point the driver at the Grid hub's host and port
driver = webdriver.Remote(command_executor='http://127.0.0.1:4444/wd/hub',
                          desired_capabilities=DesiredCapabilities.CHROME)
driver.get("https://v3u.cn")
time.sleep(1)
driver.get_screenshot_as_file("v3u.png")
driver.quit()
```

Here the Chrome driver runs in Remote mode against the host's local IP on port 4444, opens this site, and takes a screenshot to check for layout problems. Now try Firefox:

```python
import time

from selenium import webdriver
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities

driver = webdriver.Remote(command_executor='http://127.0.0.1:4444/wd/hub',
                          desired_capabilities=DesiredCapabilities.FIREFOX)
driver.get("https://v3u.cn")
time.sleep(1)
driver.get_screenshot_as_file("v3u_firefox.png")
driver.quit()
```

The Firefox screenshot differs little, but real testing surfaces detail-level differences — fonts, hyperlink colors — staples of compatibility testing. Of course, the code deserves to be written more rigorously; this is compatibility testing, after all, and nobody wants slip-ups in test work. Refactor the earlier code with Python's built-in unit-testing library, unittest:

```python
import time
import unittest

from selenium import webdriver
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities


class Example(unittest.TestCase):

    def setUp(self):
        self.driver = webdriver.Remote(
            command_executor='http://127.0.0.1:4444/wd/hub',
            desired_capabilities=DesiredCapabilities.CHROME)
        self.driver.get("https://v3u.cn")

    def test_chrome(self):
        time.sleep(1)
        self.driver.get_screenshot_as_file("v3u_chrome.png")

    def tearDown(self):
        self.driver.quit()


if __name__ == "__main__":
    unittest.main(verbosity=1)
```

The run passes (apart from some ResourceWarning noise about unclosed sockets in the Selenium client):

```
----------------------------------------------------------------------
Ran 1 test in 5.908s

OK
```

When testing is finished, Docker Compose can stop the container services with one command — very convenient, especially with many containers, since nothing has to be stopped by hand:

```
$ docker-compose -f docker-compose.yml down
Stopping firefox         ... done
Stopping chrome          ... done
Stopping mytornado_hub_1 ... done
Removing firefox         ... done
Removing chrome          ... done
Removing mytornado_hub_1 ... done
Removing network mytornado_default
```

Running docker ps again shows no remaining containers.

Conclusion: this time we covered setting up the distributed automated web-testing tool Selenium Grid, running its services, and stopping them, all without a hitch. With this style of automated testing we save a great deal of time and obtain the most accurate results efficiently. If your test machine is better specced, explore further and open as many browser instances as possible for massively concurrent compatibility testing.
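Since the hub is a single endpoint and only the requested capabilities decide which node runs a session, the whole browser matrix can be fanned out from one routine. A pure-Python sketch of that dispatch (run_case is a stand-in for a real webdriver.Remote session, which would need the Grid running):

```python
from concurrent.futures import ThreadPoolExecutor

GRID_URL = "http://127.0.0.1:4444/wd/hub"
BROWSERS = ["chrome", "firefox"]


def capabilities(browser: str) -> dict:
    # The minimal desired-capabilities payload the hub uses to route
    # a session to a matching node
    return {"browserName": browser, "platformName": "ANY"}


def run_case(browser: str, url: str) -> str:
    # Placeholder for: webdriver.Remote(GRID_URL, capabilities(browser)).get(url)
    # followed by a screenshot, as in test.py above
    return f"{browser}:{url}:PASS"


def run_matrix(urls: list) -> list:
    # Fan each URL out to every browser in parallel, the way Grid
    # schedules concurrent sessions across its nodes
    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = [pool.submit(run_case, b, u) for b in BROWSERS for u in urls]
        return [f.result() for f in futures]


print(run_matrix(["https://v3u.cn"]))
```

Swapping the placeholder for a real Remote session turns this into a parallel compatibility run, bounded by the NODE_MAX_INSTANCES values in docker-compose.yml.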

Cloud alchemy, free compute: using the So-vits library on a cloud GPU (Colab) to make an AI Trump sing The Internationale
AI technology has long since reached every corner of daily life — AI Stefanie Sun covers ring out everywhere — but not everyone owns an NVIDIA card, and life without a GPU is hard. No matter: this time we build a deep-learning environment on Google's free Colab cloud servers to create an AI Trump and have him sing The Internationale. Colab (full name Colaboratory) is Google's basic free cloud server product for writing and running Python code right in the browser, which is very convenient. Thoughtfully, Colab allocates free GPUs to users — for those without an NVIDIA card, this goes far beyond industry decency and borders on charity.
AI diva, singing online: the AI Stefanie Sun model in practice, covering 《遥远的歌》, originally sung by 晴子 (Python3.10)
As if overnight, the Asian pop queen Stefanie Sun's unique, mellow voice rings across Mandopop once again — not because she released a new album, but because AI has perfectly replicated her timbre and swept through the genre's many classics with covers of astonishing fidelity. How is it done? Here, with the Python3.10-based open-source library so-vits-svc, we have Stefanie Sun sing our favorite songs for free — song requests on demand.
Rapid evolution, light-speed transcription: Whisper.cpp in practice — real-time C++ speech-to-text (subtitles/speech recognition)
OpenAI's open-source [Whisper model](https://v3u.cn/a_id_272) is the leader in open-source speech-to-text; its one small flaw is that it cannot exploit Apple's M-series chips to speed up transcription. Whisper.cpp is a C/C++ port of the Whisper model, dependency-free and memory-frugal, and, crucially, it adds Core ML support — a perfect fit for Apple M-series chips. Whisper.cpp's tensor operators are heavily optimized for the M-series CPU, using Arm Neon SIMD intrinsics or CBLAS Accelerate framework routines depending on the computation size; the latter is especially effective at larger sizes, since Accele…
Task decomposition at leisure: AutoGPT, the self-driving version of ChatGPT — automated AI task practice (Python3.10)
When we use ChatGPT to get work done, it often takes multiple rounds of dialogue — say, having ChatGPT analyze, translate, and summarize an online article or document, then saving the summary as a local text file. Some back-and-forth haggling with ChatGPT is unavoidable — but in fact that negotiation can itself be automated. AutoGPT can decompose tasks for us automatically; after all, whatever a program can do, humans need not do by hand. All we have to do is give AutoGPT a goal; it automatically breaks the goal into small subtasks and completes them one by one, simply and efficiently.
Free vocal and backing-track separation with the AI library Spleeter in practice (Python3.10)
In video editing, suppose we have a clip from a film or TV series: playing it directly in our edit may run into copyright problems, so in most cases the vocals and background music must be separated and the music replaced for derivative work. The AI library Spleeter can handle vocal/background separation for most material. Spleeter's models come from Deezer, the largest music site, and are built on the deep-learning framework Tensorflow; it identifies the background-music stems in the material, distinguishing which parts are backing music and which are external vocals.
Become Iron Man! All it takes is one RTX3090: Microsoft open-sources the J.A.R.V.I.S. AI assistant system
A dream made real — Microsoft, true to form, has open-sourced the J.A.R.V.I.S. AI assistant system. Jarvis, short for Just A Rather Very Intelligent System, helps Iron Man Tony Stark with all manner of tasks and challenges: controlling and managing his armor, providing real-time intelligence and data analysis, supporting his decisions, and more. Now we can have our own Jarvis AI assistant too, at the cost of just one RTX3090 graphics card.
Better late than never: testing Google's AI large-language dialogue model Bard and calling it from the API (Python3.10)
Google, the giant that open-sourced the famous deep-learning framework Tensorflow, is a force to be reckoned with in AI. Its new product Bard has been in public testing for a while to mixed reviews; many compare Google's Bard with OpenAI's ChatGPT and find Bard seemingly outclassed. In fact, Google Bard is not positioned against ChatGPT: Bard is built for dialogue on the LaMDA model, and aims to be a conversational AI system that better understands human language and sustains multi-turn dialogue, while GPT's goal is generating natural-language text.
The fundamentals of AI and machine learning dissected: artificial neurons you can actually understand — turning AI jargon into plain language
By habitual thinking, people assume AI is an unfathomable industry populated by high-IQ types who, whether writing or speaking, stay cryptic and then toss out "advanced" vocabulary — "neural networks", "convolutions" — leaving everyone half-comprehending. ChatGPT's sudden fame has only further mythologized the field; in Lu Xun's words describing Zhuge Liang: "so wise as to verge on the demonic". In fact, by the 80/20 rule, as in any industry, the genuinely top talents in AI are the 20% with real disruptive ability who can build "industrial revolution"-grade products like ChatGPT, while the remaining 80% are ordinary people whose daily work is as dull and tedious as anyone's — and the apparent "industry moat" exists because this indus…
Glowing in the dark, shining alone: a roundup of dark-mode (DarkMode) web effects and animations, implemented in CSS3
As is well known, a web page's dark mode reduces screen glare and blue-light exposure and eases eye strain, especially at night. Dark mode also gives the Neon Effect a stage to perform on. The neon effect is a visual effect characterized by vivid colors on a dark background producing strong visual impact; it is commonly used in posters, advertisements, logos, and web pages. Its value lies in grabbing attention and strengthening brand recognition, because the effect is striking and memorable. This time we survey web effects suited to dark mode — still loving the nighttime you.