Generic 2
These exist:
- KPI capture, export and analysis
- This capture needs to be baked into the process flow
- But must be performant - i.e. the data is persisted outside the production system so ETL & analysis functions run externally
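A minimal sketch of what "baked into the process flow but persisted outside production" could look like: a fire-and-forget KPI capture call that appends events to an external sink (here a JSON-lines file as a stand-in for a queue or log store; the function name, sink path, and field names are all illustrative assumptions, not part of the notes).

```python
import json
import time
from pathlib import Path

# Hypothetical sink: a JSON-lines staging file standing in for the
# external store that the ETL / analysis jobs would read from.
KPI_SINK = Path("kpi_events.jsonl")

def capture_kpi(event_type: str, **fields) -> None:
    """Append a KPI event without touching production tables.

    In a real system this would be a fire-and-forget write to a
    queue or log shipped outside the production database.
    """
    event = {"type": event_type, "ts": time.time(), **fields}
    with KPI_SINK.open("a") as f:
        f.write(json.dumps(event) + "\n")

# Example: record a pick step as part of the normal process flow.
capture_kpi("pick", sku="SKU-123", qty=5, location="A-01-02")
```

Because the capture is append-only and off to the side, the production transaction path stays fast while analysis runs elsewhere.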
- Support stock / location concepts
- Incoming delivery = SKU expected
- Received delivery = homogeneous at PO / job / fulfilment level location, i.e. not yet actualised to SKU
- Booked in delivery = SKU + QTY + Location
- Pick SKU = debit SKU availability and debit location balance; credit location capacity
- Picked stock = homogeneous location + job order - not tracked at SKU level
- Data input = actualised SKUs, so SKU + QTY + Location
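The book-in and pick steps above read like double-entry postings; a sketch of that model, assuming illustrative names (`Stock`, `Location`, the debit/credit fields) that are not from the notes:

```python
from dataclasses import dataclass, field

@dataclass
class Location:
    """Hypothetical location record: units held plus free capacity."""
    balance: int = 0    # units currently held
    capacity: int = 0   # free space remaining

@dataclass
class Stock:
    """Minimal stock model for the lifecycle in the notes."""
    sku_available: dict = field(default_factory=dict)  # SKU -> qty available
    locations: dict = field(default_factory=dict)      # name -> Location

    def book_in(self, sku: str, qty: int, loc: str) -> None:
        # Booked-in delivery: actualised as SKU + QTY + Location.
        self.sku_available[sku] = self.sku_available.get(sku, 0) + qty
        location = self.locations[loc]
        location.balance += qty
        location.capacity -= qty

    def pick(self, sku: str, qty: int, loc: str) -> None:
        # Pick SKU: debit SKU availability and location balance,
        # credit location capacity.
        self.sku_available[sku] -= qty
        location = self.locations[loc]
        location.balance -= qty
        location.capacity += qty

# Usage: book in 10 units, then pick 4 of them.
stock = Stock(locations={"A-01": Location(capacity=100)})
stock.book_in("SKU-1", 10, "A-01")
stock.pick("SKU-1", 4, "A-01")
```

Every movement posts balanced entries, so availability, location balance and capacity can never drift apart.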
- Purchase Orders
- Bucket concept
- PO for jobs
- PO for fulfilment
- Performance - set targets
- # users - e.g. 100 internal; 50 external
- # orders - linear projections - see volumes
- etc..
- Interconnectivity
- Incoming
- Ability to consume external data - web orders, SKU data, pricing data, orders
- Need the ability to insert as well as update
- Need to be able to accommodate differing data sources, so need some kind of import profile which contains a config as well as the data source
- This might be an external tool such as Znyk but needs to
be usable by power users
- Performance, Performance, Performance
- Control - how about we stage data consumption so it can be validated / corrected if need be, then background insert / update?
- Pseudo API
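A sketch of the import-profile idea plus the staging step, under assumed names (`ImportProfile`, `stage`, the field names); the notes only require that a profile pair a config with a data source and that rows be validated before background insert/update:

```python
from dataclasses import dataclass

@dataclass
class ImportProfile:
    """Hypothetical import profile: config + data source, per the notes.

    `mapping` renames external fields to internal ones; `required`
    drives the staging validation step.
    """
    name: str
    mapping: dict            # external field -> internal field
    required: tuple = ()     # internal fields that must be non-empty
    mode: str = "upsert"     # insert, update, or upsert

def stage(profile: ImportProfile, rows: list) -> tuple:
    """Stage external rows: map fields and split valid / rejected.

    Rejected rows can be corrected by a power user before the
    background insert / update runs.
    """
    valid, rejected = [], []
    for row in rows:
        mapped = {profile.mapping.get(k, k): v for k, v in row.items()}
        if all(mapped.get(f) not in (None, "") for f in profile.required):
            valid.append(mapped)
        else:
            rejected.append(row)
    return valid, rejected

# Example: a web-orders feed with its own field naming convention.
weborders = ImportProfile(
    name="weborders",
    mapping={"item_code": "sku", "quantity": "qty"},
    required=("sku", "qty"),
)
ok, bad = stage(weborders, [
    {"item_code": "SKU-1", "quantity": 3},
    {"item_code": "", "quantity": 1},
])
```

Adding a new feed then means writing a new profile config, while the staging and background-apply machinery stays generic.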
- Outgoing
- Define the external data points
- Accounting
- Web Portals
- Reporting / Dashboard
- Suppliers
- Again, need some form of generic tooling that can, via configs, accommodate the specificities of external needs
- Bottom line - pushing data out to a new external point should only require changes to a config, not creation of a new extract process
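The outgoing side can mirror the import profile: one generic extract process, with per-endpoint behaviour held entirely in config. A sketch, assuming illustrative names (`ExportProfile`, `extract`, the accounting fields) not taken from the notes:

```python
from dataclasses import dataclass

@dataclass
class ExportProfile:
    """Hypothetical export profile: which fields an external point
    gets and what it calls them - all config, no code."""
    endpoint: str
    fields: tuple           # internal fields to include, in order
    rename: dict = None     # optional internal -> external names

def extract(profile: ExportProfile, records: list) -> list:
    """Generic extract: select and rename fields per the profile.

    Adding a new external point (accounting, web portal, supplier)
    means adding a new ExportProfile, not a new extract process.
    """
    rename = profile.rename or {}
    return [
        {rename.get(f, f): rec[f] for f in profile.fields}
        for rec in records
    ]

# Example: an accounting feed that only needs order id and value.
accounting = ExportProfile(
    endpoint="accounting",
    fields=("order_id", "total"),
    rename={"total": "net_value"},
)
rows = extract(accounting, [{"order_id": 1, "total": 99.5, "sku": "X"}])
```

Each new consumer (accounting, web portals, reporting, suppliers) is then a new profile instance against the same extract function.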
- Scaling