
InternLM (书生·浦语) Full-Chain Open-Source System: Course Notes, Part 1

Notes from the course "InternLM Large Model Full-Chain Open-Source System" (《书生·浦语大模型全链路开源体系》), organized by Shanghai AI Laboratory.

Assignments

InternLM2-Chat-1.8B

  • Use the InternLM2-Chat-1.8B model to generate a 300-character short story

    import torch
    from transformers import AutoTokenizer, AutoModelForCausalLM

    tokenizer = AutoTokenizer.from_pretrained("internlm2-chat-1_8b", trust_remote_code=True)
    # Set `torch_dtype=torch.float16` to load the model in float16; otherwise it is
    # loaded as float32 and may cause an OOM error.
    model = AutoModelForCausalLM.from_pretrained("internlm2-chat-1_8b", torch_dtype=torch.float16, trust_remote_code=True).cuda()
    model = model.eval()

    history = []
    while True:
        question = input("请提问: ")
        if question == "quit":  # type "quit" to end the conversation
            print("已关闭对话")  # "conversation closed"
            break
        else:
            response, history = model.chat(tokenizer, question, history=history)
            print('答:', response)

huggingface_hub

from huggingface_hub import hf_hub_download
hf_hub_download(repo_id="internlm/internlm2-chat-7b", filename="config.json")
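
As a side note, `hf_hub_download` resolves a file URL on the Hub and caches the file locally. A minimal sketch of just the URL-construction step, using the library's `hf_hub_url` helper (which builds the URL without any network access; the "main" revision is the default):

```python
from huggingface_hub import hf_hub_url

# Build the resolve URL for config.json in the repo; nothing is downloaded here.
url = hf_hub_url(repo_id="internlm/internlm2-chat-7b", filename="config.json")
print(url)
```

Note that the `repo_id` string must match the Hub repository name exactly; stray whitespace (e.g. a trailing space) makes the lookup fail with a repository-not-found error.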