AI-Enabled Customised Workflows for Smarter Supply Chain Optimisation: A Feasibility Study

Vahid Javidroozi* (Corresponding / Lead Author), Abdel-Rahman Tawil, R. Muhammad Atif Azad, Brian Bishop, Nouh Elmitwally (Corresponding / Lead Author)

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    Abstract

    This study investigates the integration of Large Language Models (LLMs) into supply chain workflow automation, with a focus on their technical, operational, financial, and socio-technical implications. Building on Dynamic Capabilities Theory and Socio-Technical Systems Theory, the research explores how LLMs can enhance logistics operations, increase workflow efficiency, and support strategic agility within supply chain systems. Using two developed prototypes, the Q inventory management assistant and the nodeStream© workflow editor, the paper demonstrates the practical potential of GenAI-driven automation in streamlining complex supply chain activities. A detailed analysis of system architecture and data governance highlights critical implementation considerations, including model reliability, data preparation, and infrastructure integration. The financial feasibility of LLM-based solutions is assessed through cost analyses related to training, deployment, and maintenance. Furthermore, the study evaluates the human and organisational impacts of AI integration, identifying key challenges around workforce adaptation and responsible AI use. The paper culminates in a practical roadmap for deploying LLM technologies in logistics settings and offers strategic recommendations for future research and industry adoption.
    Original language: English
    Journal: Applied Sciences (Switzerland)
    Volume: 15
    Issue number: 17
    DOIs
    Publication status: Published (VoR) - 27 Aug 2025

