Generative AI is revolutionizing incident management by shifting operations from reactive to proactive, enhancing efficiency, scalability, and accuracy in infrastructure systems. This paper explores the application of Generative AI in proactive incident management, detailing its capabilities in predictive analytics and automated resolution workflows. Using a real-world case study of an AI-powered incident management tool developed for global enterprises, we analyze the transformative impact of AI-driven solutions. Key challenges and best practices are identified, providing a roadmap for modern enterprises aiming to integrate Generative AI into their operational frameworks. Furthermore, the paper delves into the challenges of integrating AI into global operations, including data scalability, system complexity, and cross-functional collaboration. It highlights best practices for product managers, emphasizing the importance of defining measurable goals, fostering stakeholder alignment, and leveraging iterative development to create impactful solutions.
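The abstract does not describe the tool's internals, but the shift from reactive to proactive operations can be illustrated with a minimal sketch: score each incoming metric sample against its recent history and open an automated remediation workflow before users report an outage. The function names, threshold, and remediation hook below are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: threshold, metric, and remediation hook are hypothetical.
from statistics import mean, stdev

def anomaly_score(history, latest):
    """Z-score of the latest observation against a trailing window of samples."""
    mu, sigma = mean(history), stdev(history)
    return 0.0 if sigma == 0 else (latest - mu) / sigma

def proactive_check(history, latest, threshold=3.0):
    """Decide whether to open an automated incident workflow before an outage is reported."""
    if abs(anomaly_score(history, latest)) >= threshold:
        return "open_incident_and_trigger_runbook"  # placeholder for an automated resolution workflow
    return "no_action"

# Example: per-minute error-rate samples; the final spike would trigger a proactive incident.
window = [0.8, 1.1, 0.9, 1.0, 1.2, 0.9, 1.0, 1.1]
print(proactive_check(window, 4.7))
```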
Organizations face significant challenges when deploying Large Language Models (LLMs) in production environments, driven chiefly by high computational requirements and resource constraints, particularly where resource optimization is essential. While recent studies have focused primarily on training efficiency, deployment optimization and lifecycle management have received limited attention. To address this gap, we introduce (1) a comprehensive framework for evaluating resource-performance trade-offs across deployment scenarios, together with metrics such as Resource Utilization Efficiency (RUE) and Lifecycle Impact Factor (LIF), and (2) a thorough analysis of four well-known 7B-parameter models across production environments ranging from cloud to edge computing. This study provides useful insights for organizations that run open-source LLMs in resource-constrained environments, and our findings offer practical guidelines for enhancing deployment strategies while sustaining performance standards and regulatory compliance.
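The abstract names Resource Utilization Efficiency (RUE) without giving its formula. A minimal sketch, assuming RUE is defined as useful output (serving throughput in tokens per second) per unit of consumed resource (GPU memory in GB), might look like the following; the definition, field names, and sample figures are assumptions for illustration, not the paper's metric.

```python
# Hypothetical sketch: assumes RUE = tokens/s of serving throughput per GB of GPU memory used.
from dataclasses import dataclass

@dataclass
class DeploymentRun:
    model: str
    tokens_per_second: float  # measured serving throughput
    gpu_memory_gb: float      # peak GPU memory consumed

def resource_utilization_efficiency(run: DeploymentRun) -> float:
    """Higher is better: more useful output per unit of scarce resource."""
    return run.tokens_per_second / run.gpu_memory_gb

runs = [
    DeploymentRun("model-a-7b", tokens_per_second=42.0, gpu_memory_gb=14.0),
    DeploymentRun("model-b-7b", tokens_per_second=35.0, gpu_memory_gb=9.5),
]
for r in sorted(runs, key=resource_utilization_efficiency, reverse=True):
    print(f"{r.model}: RUE = {resource_utilization_efficiency(r):.2f} tokens/s per GB")
```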
The digital transformation of manufacturing processes presents unique challenges in adapting proven retail forecasting methodologies to industrial settings. This paper introduces a novel cloud-native framework that bridges the gap between retail and manufacturing demand forecasting, leveraging advanced data analytics and a scalable architecture. We present (1) an adaptive methodology for translating retail demand patterns to manufacturing contexts, (2) a cloud-native architecture supporting real-time data integration and scalable processing, and (3) empirical validation on public retail and manufacturing datasets. The framework demonstrates significant improvements in forecast accuracy (a 15.0% reduction in Mean Absolute Percentage Error) and resource utilization (a 40% reduction in computing costs). Analysis across 50 manufacturing facilities shows that the retail-derived demand forecasting techniques effectively optimize manufacturing processes while maintaining 99.999% system availability. The framework also achieves a 25% reduction in safety stock levels and a 35% improvement in inventory turnover, providing practical insights for organizations pursuing digital transformation initiatives in manufacturing.
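The headline accuracy gain is reported as a reduction in Mean Absolute Percentage Error (MAPE), a standard forecast-accuracy metric. The short sketch below shows how such an evaluation is typically scored; the demand figures are invented for illustration and are not taken from the paper's datasets.

```python
# Standard MAPE calculation for comparing forecast accuracy; sample values are invented.
def mape(actual, forecast):
    """Mean Absolute Percentage Error, in percent (assumes no zero actuals)."""
    errors = [abs(a - f) / abs(a) for a, f in zip(actual, forecast)]
    return 100.0 * sum(errors) / len(errors)

actual_demand = [120, 135, 150, 160, 155]
baseline_fcst = [110, 150, 140, 175, 140]
improved_fcst = [118, 139, 147, 164, 151]

base, improved = mape(actual_demand, baseline_fcst), mape(actual_demand, improved_fcst)
print(f"baseline MAPE: {base:.1f}%, improved MAPE: {improved:.1f}%")
print(f"relative reduction: {100 * (base - improved) / base:.1f}%")
```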