FC Large Language Model Deployment + Local Knowledge Base

Published: 2023-10-29 11:01:09 · Author: freedragon

FC Invoke Start RequestId: 930989fb-8910-400d-b981-1de87e89a3e3
Info: @serverless-cd/engine: 0.0.51, linux-x64, node-v14.19.2
plugin @serverless-cd/checkout has been installed
plugin @serverless-cd/s-setup has been installed
plugin @serverless-cd/s-deploy has been installed
Start checkout plugin
Execute command: s init fc-llm --parameters '{"appName":"pg-chatglm2-6b-webui","functionName":"fc-llm-afma-sljnyc4r","llmModel":"chatglm2-6b-int4","region":"cn-hangzhou","roleArn":"acs:ram::1650087321066347:role/aliyunfcdefaultrole","serviceName":"fc-llm"}' --project fc-llm -d /kaniko/tmp/workspace 
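The `--parameters` JSON passed to `s init` above is dense; unpacking it (values copied verbatim from the command) makes the deployment choices easier to read:

```python
import json

# The exact --parameters payload passed to `s init fc-llm` above.
raw = ('{"appName":"pg-chatglm2-6b-webui",'
       '"functionName":"fc-llm-afma-sljnyc4r",'
       '"llmModel":"chatglm2-6b-int4",'
       '"region":"cn-hangzhou",'
       '"roleArn":"acs:ram::1650087321066347:role/aliyunfcdefaultrole",'
       '"serviceName":"fc-llm"}')
params = json.loads(raw)

# The template deploys ChatGLM2-6B (int4-quantized) into service fc-llm
# in region cn-hangzhou, using the default FC service role.
print(params["llmModel"], params["serviceName"], params["region"])
```

In short: the int4-quantized ChatGLM2-6B model, a service named `fc-llm`, and the Hangzhou region.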



More applications: https://registry.serverless-devs.com
Downloading[/simple/fc-llm/zipball/0.0.12]...
Download fc-llm successfully

Thanks for using Serverless-Devs
You could [cd /kaniko/tmp/workspace] and enjoy your serverless journey!
If you need help for this example, you can use [s -h] after you enter folder.
Document ❤ Star: https://github.com/Serverless-Devs/Serverless-Devs
More applications: https://registry.serverless-devs.com

End checkout plugin
start @serverless-cd/s-setup run
@serverless-devs/s3: 0.0.8, s-home: /root/.s, linux-x64, node-v16.16.0

Alias:      default
Credential: 
  AccessKeyID:     STS***********************E9n
  AccessKeySecret: Hwu**************************************Jp3
  __provider:      Alibaba Cloud
  SecurityToken:   CAI******************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************************SAA
  AccountID:       165**********347

Run @serverless-cd/s-setup end
start @serverless-cd/s-deploy run
Execute command: s deploy --use-local --assume-yes --skip-push -t s.yaml


[2023-10-29 02:14:45] [INFO] [S-CORE] - It is detected that your project has the following projects < chatglm6b-server,llm-model-download,pgvector_llm-pgvector_llm > to be execute
[2023-10-29 02:14:45] [INFO] [S-CORE] - Start executing project chatglm6b-server
Checking Service fc-llm exists
Creating vpc: fc-deploy-component-generated-vpc-cn-hangzhou
Creating vswitch: fc-deploy-component-generated-vswitch-cn-hangzhou
Creating securityGroup: fc-deploy-component-generated-securityGroup-cn-hangzhou
Creating NasFileSystem: Alibaba-FcDeployComponent-DefaultNas-cn-hangzhou
Creating MountTarget: 07e2f4ba53
Checking Service _FC_NAS_fc-llm exists
Checking Function nas_dir_checker exists
Checking Triggers
Creating Service _FC_NAS_fc-llm...
Creating Function _FC_NAS_fc-llm/nas_dir_checker...
Creating Trigger...
custom container acceleration status...
Checking Function chatglm exists
Checking Triggers
Creating Service fc-llm...
Creating Function fc-llm/chatglm...
Creating Trigger...
custom container acceleration status...

There is auto config in the service: fc-llm

Tips for next step
======================
* Display information of the deployed resource: s info
* Display metrics: s metrics
* Display logs: s logs
* Invoke remote function: s invoke
* Remove Service: s remove service
* Remove Function: s remove function
* Remove Trigger: s remove trigger
* Remove CustomDomain: s remove domain



[2023-10-29 02:15:20] [INFO] [S-CORE] - Project chatglm6b-server successfully to execute 
[2023-10-29 02:15:20] [INFO] [S-CORE] - Start executing project llm-model-download
[2023-10-29 02:15:20] [INFO] [S-CORE] - Start the pre-action
[2023-10-29 02:15:20] [INFO] [S-CORE] - Action: npm i
added 61 packages from 32 contributors and audited 62 packages in 7.859s

6 packages are looking for funding
  run `npm fund` for details

found 0 vulnerabilities

[2023-10-29 02:15:28] [INFO] [S-CORE] - End the pre-action
Checking Service fc-llm exists
Checking Function llm-model-download exists
Creating Service fc-llm...
Creating Function fc-llm/llm-model-download...
custom container acceleration status...

Tips for next step
======================
* Display information of the deployed resource: s info
* Display metrics: s metrics
* Display logs: s logs
* Invoke remote function: s invoke
* Remove Service: s remove service
* Remove Function: s remove function
* Remove Trigger: s remove trigger
* Remove CustomDomain: s remove domain



[2023-10-29 02:15:37] [INFO] [S-CORE] - Start the after-action
[2023-10-29 02:15:37] [INFO] [S-CORE] - Action: fc ondemand put --qualifier LATEST --max 1
[2023-10-29 02:15:37] [INFO] [FC] - Updating on-demand: fc-llm.LATEST/llm-model-download
[2023-10-29 02:15:37] [INFO] [S-CORE] - Action: fc invoke --service-name fc-llm  --function-name llm-model-download
invoke function: fc-llm / llm-model-download


========= FC invoke Logs begin =========
FC Invoke Start RequestId: 1-653dc049-a14ede89686b27baa7498a28
load code for handler:index.handler
2023-10-29T02:15:50.091Z 1-653dc049-a14ede89686b27baa7498a28 [verbose] Downloading[/chatglm2-6b-int4/pytorch_model.bin]...
2023-10-29T02:18:28.009Z 1-653dc049-a14ede89686b27baa7498a28 [error] Download failed
FC Invoke End RequestId: 1-653dc049-a14ede89686b27baa7498a28

Duration: 158094.32 ms, Billed Duration: 158095 ms, Memory Size: 3072 MB, Max Memory Used: 3044.87 MB
========= FC invoke Logs end =========
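The billing line above is consistent with the measured duration being rounded up to the next whole millisecond; assuming that rounding rule, a quick sanity check:

```python
import math

duration_ms = 158094.32             # measured duration from the log
billed_ms = math.ceil(duration_ms)  # assume billing rounds up to a whole ms
print(billed_ms)                    # 158095, matching "Billed Duration" above
```

Note that the invocation also peaked at 3044.87 MB of a 3072 MB memory cap while downloading the model weights, so the memory size leaves little headroom.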

FC Invoke instanceId: c-653dc049-a0c791277ae94c31a7b1

FC Invoke Result:
download success


[2023-10-29 02:18:28] [INFO] [S-CORE] - Action: fc nas upload -r ./code/pg-chatglm2-6b-webui /mnt/auto/llm/pg-chatglm2-6b-webui
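`fc nas upload -r` pushes the local WebUI code tree onto the NAS mount recursively. Conceptually it behaves like a recursive file copy; the sketch below is a local stand-in (a plain directory walk and copy, not the actual NAS transport), with a local directory playing the role of the `/mnt/auto/llm/...` mount point:

```python
import os
import shutil

def upload_recursive(src: str, dst: str) -> int:
    """Mirror every file under src into dst, preserving the tree.

    Local stand-in for `fc nas upload -r`; dst plays the role of the
    NAS mount point. Returns the number of files copied.
    """
    count = 0
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target = dst if rel == "." else os.path.join(dst, rel)
        os.makedirs(target, exist_ok=True)
        for name in files:
            shutil.copy2(os.path.join(root, name), os.path.join(target, name))
            count += 1
    return count
```

Because the upload targets the NAS file system mounted into the service, the WebUI code survives across function cold starts without being baked into the container image.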

Checking Service _FC_NAS_fc-llm exists
Checking Function nas_dir_checker exists
Checking Triggers
Creating Service _FC_NAS_fc-llm...
Creating Function _FC_NAS_fc-llm/nas_dir_checker...
Creating Trigger...
custom container acceleration status...