Platform: How AI is Enabling New Capabilities and Features
Mike Shchelkonogov, Gabriel Michaud

Presented by:
- Mike Shchelkonogov, CTO
- Gabriel Michaud, Founder & CEO

AI Today

History and Rise of LLM
- 1967: ELIZA
- 1997: LSTM
- 2010: Stanford CoreNLP
- 2011: Google Brain
- 2017: Google Transformer architecture
- 2020: OpenAI GPT-3

What enables LLM
- NLP
- Transformer architecture
- Attention
- Embeddings

What distinguishes LLM
- Pre-trained on huge data sets
- Exceptionally large number of parameters

Where LLM can be applied
- Conversational AI
- Understanding sentiment
- Machine translation
- Text content generation

What Are LLM Challenges and Limitations
- Lack of real understanding
- Generative errors and unreliable information
- Lack of explainability
- Fine-tuning challenges
- Difficulty in handling long-term context
- Need for massive amounts of data
- Computational resources
- Ethical concerns

What To Expect
- LLM is just one application of the transformer architecture
- Transformers have wide application outside of NLP:
  - Speech and video recognition
  - Speech and video generation
  - Physical modeling
  - Physical process analysis
  - DNA analysis
  - many more
- LLM and other types of transformers will become a commodity technology
- We can expect all major vendors to focus on providing transformers as a service

Acumatica Realities

How Do We Use LLM
- Knowledge base and support automation
- Search and education
- Content generation
- Translation to foreign languages
- Coding as another form of content generation:
  - Platform code
  - Application code
  - QA automation

Scenarios We Are Working On for the Application
- Copilot:
  - Email composing
  - Sentiment analysis
  - Built-in content generation
  - Converting text or speech to actions (chat functionality)
- Customization:
  - Script generation
  - Code generation
  - Common tasks automation
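The attention mechanism named under "What enables LLM" is, at its core, scaled dot-product attention: each token's query is compared against every key, and the resulting weights mix the value vectors. A minimal NumPy sketch (the shapes and the self-attention toy example are illustrative, not any vendor's implementation):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    weights = softmax(scores)        # each row sums to 1
    return weights @ V               # weighted mix of value vectors

# Toy example: 3 token embeddings of dimension 4 attending to themselves.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = attention(X, X, X)
print(out.shape)  # (3, 4)
```

In a real transformer this runs per attention head, with learned projections producing Q, K, and V from the token embeddings; the "exceptionally large number of parameters" the slides mention lives mostly in those projections and the feed-forward layers.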
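The Copilot item "converting text or speech to actions" describes a chat interface that maps a model's reply to application commands. A minimal sketch of the dispatch side, assuming the LLM is prompted to answer only with JSON containing an `action` and `params` (the action names and the `call_llm` stub are hypothetical, not Acumatica's API):

```python
import json

# Hypothetical registry mapping chat intents to application commands.
ACTIONS = {
    "create_invoice": lambda p: f"Invoice created for {p['customer']}",
    "send_email":     lambda p: f"Email sent to {p['to']}",
}

def call_llm(user_text: str) -> str:
    """Stub standing in for a real LLM call that has been prompted to
    reply only with JSON like {"action": ..., "params": {...}}."""
    return json.dumps({"action": "create_invoice",
                       "params": {"customer": "ACME"}})

def handle_chat(user_text: str) -> str:
    reply = json.loads(call_llm(user_text))
    handler = ACTIONS.get(reply.get("action"))
    if handler is None:
        # Unknown or unregistered actions are refused rather than executed.
        return "Sorry, I can't do that yet."
    return handler(reply["params"])

print(handle_chat("Bill ACME for last month"))  # Invoice created for ACME
```

Restricting the model to a fixed registry of actions, rather than executing arbitrary generated code, is one common way to keep such chat functionality safe and auditable.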