Hi JoggAI Team, if you can, please pass this on to the product development team.
A feature that is really needed is lip sync for avatars that are walking or dancing. Right now the lip sync on Avatar X only works for avatars that are standing still and swaying side to side in one spot. The only model I have seen so far that can lip sync with movement, like walking while singing or dancing, or add lip sync to a previously AI-generated video with real motion, is Dzine's lip sync model. I would not mind spending AI credits if JoggAI offered something like Dzine's lip sync model. That would be a true upgrade. Could you pass a message to the development team to look into Dzine? Their contact is contact@dzine-ai-connect.com
It would also be nice if you could add the Runway Aleph model. Runway Aleph lets you edit a previously AI-generated video with a prompt, similar to how Nano Banana does prompt-based image editing.
Here is the info for Runway Aleph:
Primary Contact Methods:
Developer Portal: dev.runwayml.com
Enterprise Sales: runwayml.com/enterprise
API Documentation: docs.dev.runwayml.com
For Enterprise/Commercial Licensing for Aleph Model:
Enterprise Team Contact: Available through the enterprise page for custom commercial arrangements
Business Development: Contact via the enterprise portal for partnership discussions
API Partnerships: Runway works with strategic partners and large organizations for Aleph access
Third-Party Aleph API Providers:
Replicate: replicate.com/runwayml/gen4-al... (official model hosting)
Segmind: api.segmind.com/v1/runway-gen4... (verified third-party access)
KIE.ai: docs.kie.ai/runway-api/generat... (API documentation)
Thanks