Live2D is a set of software tools that turns 2D artwork into animations usable across a variety of media and platforms, from games to animated movies.
This project is an Electron version of the live2d-html project. The character reacts to your mouse clicks and to spoken commands by performing actions for you.
All listening and speaking are in Chinese, and there is no English version of the robot yet. (But you can still put it on your desktop for ... decoration?)
- Tracking the cursor (a rough sketch follows this list)
- Lip sync with the voice
- Chatting in Chinese
- Replacing the model
- Importing user models
- Other chat-robot features (searching, asking about the weather, ...)
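
To illustrate the cursor-tracking idea only: this is a minimal sketch, not the project's actual code. The `Live2DModelLike` interface, its `setParam` method, and the parameter IDs are assumptions standing in for the real Live2D SDK wiring.

```ts
// Hypothetical sketch: map the cursor position to head-angle parameters.
// `Live2DModelLike` and setParam are assumed interfaces, not the real SDK API.
interface Live2DModelLike {
  setParam(id: string, value: number): void;
}

function trackCursor(model: Live2DModelLike, canvas: HTMLCanvasElement): void {
  document.addEventListener("mousemove", (e: MouseEvent) => {
    const rect = canvas.getBoundingClientRect();
    // Normalize the cursor position to [-1, 1] relative to the canvas center.
    const nx = ((e.clientX - rect.left) / rect.width) * 2 - 1;
    const ny = ((e.clientY - rect.top) / rect.height) * 2 - 1;
    // Typical Live2D head-angle parameters span roughly -30..30 degrees.
    model.setParam("PARAM_ANGLE_X", nx * 30);
    model.setParam("PARAM_ANGLE_Y", -ny * 30);
  });
}
```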
It uses the zhima (知麻) robot to translate Chinese sentences into intent codes. A hand-written execution engine then carries out the instructions it has understood. It also uses the xunfei (讯飞) engine for speech recognition and synthesis.
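
As a hedged sketch of how such an intent-driven execution engine could be structured (the intent codes, handler names, and `executeIntent` function below are hypothetical, not taken from the project), recognized speech is mapped to an intent code plus slots and then dispatched to a handler:

```ts
// Hypothetical sketch of an intent-dispatch loop; the intent codes and
// handlers are illustrative only, not the project's real ones.
type IntentHandler = (slots: Record<string, string>) => Promise<string>;

const handlers: Record<string, IntentHandler> = {
  // e.g. the NLU service returns { intent: "weather.query", slots: { city: "北京" } }
  "weather.query": async (slots) => `Looking up the weather for ${slots.city}...`,
  "web.search": async (slots) => `Searching the web for ${slots.keyword}...`,
};

async function executeIntent(
  intent: string,
  slots: Record<string, string>
): Promise<string> {
  const handler = handlers[intent];
  if (!handler) {
    return "Sorry, I don't understand that yet.";
  }
  // The reply string would then be handed to the speech-synthesis engine.
  return handler(slots);
}
```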
Important! You will need to ask us for `conf.ts` to build this project, because it contains the secret keys of third-party services. Send an email to Sunxfancy or Norgerman, or open a new issue.
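
The exact contents of `conf.ts` are not public; purely as a guess at its general shape (the field names below are hypothetical), it would export the credentials for the third-party services along these lines:

```ts
// Hypothetical shape of conf.ts -- field names are guesses, not the real file.
// The actual keys must be requested from the maintainers.
export default {
  xunfei: {
    appId: "YOUR_XUNFEI_APP_ID",
    apiKey: "YOUR_XUNFEI_API_KEY",
  },
  zhima: {
    token: "YOUR_ZHIMA_TOKEN",
  },
};
```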
Clone this git repository:

```sh
git clone https://github.com/Norgerman/Seityan
```

Install all the dependencies with `npm i`:

```sh
cd ./Seityan
npm i
```

Run the build script:

```sh
npm run build
```

Or build the code in watch mode:

```sh
npm run watch
```

To run the project, use `npm start`:

```sh
npm start
```

This project is released under the MIT License.
