International Athletic Apparel Brand Scales to Meet Business Demands with Apache Kafka® Supply Chain Management Automation
A global athletic apparel brand with 400+ locations and a multi-billion-dollar business automates its supply chain management process with AIM Consulting’s implementation of Apache Kafka®.
The company utilized a variety of systems to source raw materials from numerous mills and suppliers to create its proprietary fabric and turn it into products. Aggregating that data and communicating it to the internal ordering systems was a manual process that lacked automation.
On its path to maturation, the brand chose to implement Apache Kafka, an open-source distributed messaging system that connects different data pipelines. With Kafka, users can initiate work orders that flow to downstream systems, with automated trigger events for ordering the exact amount, color, and type of fabric needed to produce the company’s products.
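As a hypothetical sketch of the trigger-event idea described above, the handler below turns an incoming work-order event into a fabric order. The field names, units, and yield factor are illustrative assumptions, not details from the engagement; in production this logic would sit behind a Kafka consumer and publish its result to a downstream ordering topic.

```python
# Hypothetical sketch: derive a fabric-order event from a work-order event.
# Field names, units, and the yield factor are illustrative assumptions.

def fabric_order_from_work_order(work_order: dict, yield_factor: float = 1.05) -> dict:
    """Translate a work-order event into a fabric-order event.

    yield_factor slightly pads the order to cover cutting waste (assumed).
    """
    meters_needed = work_order["units"] * work_order["fabric_meters_per_unit"]
    return {
        "event_type": "fabric.order.requested",
        "fabric_type": work_order["fabric_type"],
        "color": work_order["color"],
        "quantity_meters": round(meters_needed * yield_factor, 2),
        "source_work_order": work_order["work_order_id"],
    }

# In a real deployment a Kafka consumer would invoke this handler per message;
# here we call it directly to show the transformation.
order = fabric_order_from_work_order({
    "work_order_id": "WO-1001",   # hypothetical identifier
    "fabric_type": "proprietary-knit",
    "color": "black",
    "units": 400,
    "fabric_meters_per_unit": 1.5,
})
print(order["quantity_meters"])  # 400 * 1.5 * 1.05 = 630.0
```

The point of the sketch is the decoupling: the ordering logic reacts to events on a topic rather than being wired directly into the upstream system.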
The organization initially hired a consulting company specializing in Apache Kafka to own the implementation. However, the consulting company did not specialize in enterprise solutions, and the implementation fell far short of expectations. While the implementation included very basic business logic with Kafka, it lacked a cloud-deployable distributed microservices architecture, DevOps with CI/CD, metrics, and telemetry, all of which are baseline requirements for an enterprise to realize the true value of the technology.
As the project’s end date loomed, the firm’s tendency to work in isolation eroded trust. Rudimentary problems persisted even as the consulting firm delivered more slowly and failed to answer complex questions from stakeholders. Realizing the need for a fresh approach grounded in a more holistic understanding of the business objectives for Kafka and what it would take to achieve them, project leaders sought expert guidance from a technology consulting firm with enterprise-level experience to realign the project. Based on successful previous engagements, the stakeholders turned to AIM Consulting.
- An automated system robust enough to handle the company’s meteoric growth
- Streamlined sourcing of raw materials across numerous systems, with automated data communication to internal ordering systems
- Support for the organization’s 20% YoY growth
The team chose AIM to oversee the implementation based on the thoroughness of AIM’s assessment and plan for the new Kafka solution architecture. After implementation, the team will have an automated system robust enough to handle the company’s meteoric growth.
AIM is also involved in additional discussions with IT leaders on how to leverage Kafka data integration across multiple divisions and systems. The eventual goal is to grow this initial implementation into a full enterprise data streaming solution.
The engagement proceeded in phases:
- Discovery & Solution Analysis
- System Design
- System Design Handoff
- Enterprise Governance & Solution Design
- Enterprise Governance & Solution Design Handoff
AIM Consulting’s Application Development practice, with deep expertise in complex enterprise-grade integrations, provided a thorough assessment of the solution’s current state along with a comprehensive architectural strategy and execution plan to lead the team out of its predicament and realize the business value it desired.
AIM first met with organizational subject-matter experts and thoroughly analyzed the existing supply-chain management system architecture, business requirements, and the stalled Kafka effort. Then, after comparing the information with several known successful enterprise Kafka solutions, AIM recommended a new Kafka architecture based on industry standards and best practices.
AIM’s recommended architecture emphasized end-to-end automation in a microservices environment leveraging Docker containers and Kubernetes with full cloud integration. Critically, the guidance was also cross-cloud compatible, as the company was planning to transition from Amazon Web Services (AWS) to Microsoft Azure cloud services in the future.
AIM’s recommendations included:
- Schema Registry, Kafka Connect, and KStream microservices using Spring Cloud framework
- Apache Avro schema
- Metrics from Kafka components and streaming applications
- Centralized enterprise logging
- Rule-based alerting to recognize excessive loads or outages
- Quality-driven development emphasizing unit testing and integration testing, automated and performed by the CI/CD system
- Multi-region disaster-recovery plan
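To illustrate the Apache Avro recommendation above, the sketch below declares a schema for a work-order event; in practice it would be registered with Schema Registry so every producer and consumer shares one versioned contract. The field names and namespace are hypothetical, not taken from the engagement.

```python
import json

# Illustrative Apache Avro schema for a work-order event (field names and
# namespace are hypothetical assumptions).
work_order_schema = {
    "type": "record",
    "name": "WorkOrder",
    "namespace": "com.example.supplychain",  # hypothetical namespace
    "fields": [
        {"name": "work_order_id", "type": "string"},
        {"name": "fabric_type", "type": "string"},
        {"name": "color", "type": "string"},
        {"name": "quantity_meters", "type": "double"},
        # Optional field with a default, so existing consumers keep working
        # when newer producers start sending it (Avro schema evolution).
        {"name": "rush", "type": ["null", "boolean"], "default": None},
    ],
}

# Avro schemas are plain JSON documents, which is what gets sent to
# Schema Registry when registering a new subject version.
print(json.dumps(work_order_schema, indent=2))
```

Defaults on newly added fields are what make backward-compatible evolution possible, which is the main reason a registry-enforced schema was part of the recommendation.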
Why Apache Kafka® Was Chosen
Kafka is a fault-tolerant distributed system, making it well suited to the applications behind the organization’s internal ordering systems.
Kafka distributes data across multiple nodes for a highly available deployment within a single data center or across multiple availability zones.
As a distributed messaging system, Kafka connects different data pipelines so that users can initiate work orders that flow to downstream systems, with automated trigger events for ordering the exact amount, color, and type of fabric needed to produce the company’s proprietary apparel.