Supply Chain Management
In this research, we propose to extend the robust optimization technique and make it more suitable for problems encountered in SCM. We explore the use of intuitive constraints on aggregates (sums, differences, etc.) and investigate how capacity/inventory planning can be carried out in this setting. We rely primarily on heuristic optimization methods, since the non-convexity of these problems, especially in the presence of price breakpoints, precludes polynomial-time algorithms. These methods include advanced implementations of simulated annealing and genetic algorithms, coupled with integer linear programming. We also investigate and extend methods based on information theory to quantify the amount of information driving the supply chain. These methods have the potential to quantitatively compare different sets of assumptions about the future.
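As a rough illustration of the heuristic approach, the sketch below applies simulated annealing to a toy single-item capacity planning problem with one price breakpoint. All numbers (prices, the breakpoint, demand bounds, penalty weight) are illustrative assumptions, not part of the actual research model, which would optimize a vector of quantities across periods:

```python
import math
import random

def cost(order_qty):
    """Toy piecewise purchase cost with one price breakpoint
    (unit price drops from 10 to 8 above 100 units) plus a small
    quadratic holding cost; all numbers are illustrative assumptions."""
    unit = 10.0 if order_qty <= 100 else 8.0
    return unit * order_qty + 0.05 * order_qty ** 2 / 100

def total_cost(q, demand_lo):
    """Objective with a heavy penalty for ordering below the
    worst-case lower demand bound (robust feasibility)."""
    return cost(q) + 1000.0 * max(0, demand_lo - q)

def anneal(demand_lo=80, demand_hi=160, steps=5000, temp0=50.0):
    """Simulated annealing over a single order quantity."""
    x = random.randint(demand_lo, demand_hi)
    best = x
    for t in range(steps):
        temp = temp0 * (1 - t / steps) + 1e-9
        # Propose a small random move, clipped to a sane range.
        cand = min(max(x + random.randint(-10, 10), 0), 2 * demand_hi)
        delta = total_cost(cand, demand_lo) - total_cost(x, demand_lo)
        # Accept improvements always, uphill moves with Boltzmann probability.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = cand
        if total_cost(x, demand_lo) < total_cost(best, demand_lo):
            best = x
    return best
```

The breakpoint makes the cost non-convex, which is precisely why gradient-based or pure LP methods struggle and metaheuristics are attractive here.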
Supply Chain Management Database
This is a new kind of DBMS paradigm for storing linear, quadratic, and convex constraints. None of the current database paradigms can store such constraints in a structured way. Our DBMS stores the constraints as compressed polytopes. This database is useful for storing the constraints of large-scale optimization problems. Many of the basic relational database properties are also supported.
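To make the idea concrete, here is a minimal sketch (not the actual DBMS, which also compresses the polytope representation) of a constraint "table" holding rows of the form a·x <= b, with a relational-style selection operation. All class and method names are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class LinearConstraint:
    """One row a.x <= b of a polytope Ax <= b, stored sparsely
    as {variable name: coefficient}."""
    coeffs: dict
    rhs: float

@dataclass
class ConstraintStore:
    """Toy constraint table: insert rows, select by variable,
    and test a point against every stored constraint."""
    rows: list = field(default_factory=list)

    def insert(self, coeffs, rhs):
        self.rows.append(LinearConstraint(dict(coeffs), float(rhs)))

    def select_by_variable(self, var):
        """Relational-style selection: constraints mentioning var."""
        return [c for c in self.rows if var in c.coeffs]

    def is_feasible(self, point):
        """Check a candidate point (dict of variable values, missing
        variables treated as 0) against all stored constraints."""
        return all(
            sum(a * point.get(v, 0.0) for v, a in c.coeffs.items()) <= c.rhs
            for c in self.rows
        )
```

Storing coefficients sparsely by variable name is one plausible choice for SCM models, where each constraint typically touches only a few of the thousands of variables.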
Decision Support Methods under Uncertainty - Volume Estimation of Convex Polytopes in High Dimensions
A large number of applications use linear constraints to specify uncertainty. In supply chains, a set of linear inequalities is used to bound demand and supply. The set of linear inequalities forms a polytope, the volume of which represents the information content. Calculating the exact volume is a #P-hard problem, so we need an approximation algorithm that can estimate the volume within a given error. Our existing volume estimation module calculates the approximate volume of a convex polytope using a uniform sampling method (works up to ~6 dimensions). Existing third-party software such as Vinci and Qhull can calculate the exact volume up to 9-10 dimensions, after which they stop working. Presently we are working on a Monte Carlo technique, which is a randomized approximation algorithm. Our aim is to extend the work so that it handles 100 dimensions or more in polynomial time with an acceptable error. The volume will be combined with information-theoretic concepts to estimate the information content of a polytope/scenario. The main challenge is the dimensionality of the polytopes (or, more generally, convex bodies): large problems can have thousands of dimensions, challenging the fastest computational geometry algorithms known to date.
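The uniform sampling baseline can be sketched as rejection sampling inside a bounding box: sample points uniformly, count the fraction landing inside the polytope, and scale by the box volume. This sketch assumes the polytope is given as rows (a, b) meaning a·x <= b together with a known axis-aligned bounding box; its error shrinks like O(1/sqrt(n)), but the hit rate decays exponentially with dimension, which is exactly why this approach stalls around 6 dimensions and motivates the Monte Carlo / MCMC work described above:

```python
import random

def polytope_volume_mc(constraints, box, n_samples=200_000, seed=0):
    """Estimate vol{x : Ax <= b} by uniform rejection sampling.

    constraints: list of (a, b) rows, each meaning a.x <= b,
                 with a a list of coefficients.
    box: axis-aligned bounding box [(lo, hi), ...], one pair
         per dimension, assumed to contain the polytope.
    """
    rng = random.Random(seed)
    # Volume of the bounding box.
    box_vol = 1.0
    for lo, hi in box:
        box_vol *= (hi - lo)
    # Count uniform samples that satisfy every inequality.
    hits = 0
    for _ in range(n_samples):
        x = [rng.uniform(lo, hi) for lo, hi in box]
        if all(sum(ai * xi for ai, xi in zip(a, x)) <= b
               for a, b in constraints):
            hits += 1
    return box_vol * hits / n_samples
```

For example, the 2-D simplex {x >= 0, y >= 0, x + y <= 1} inside the unit box has true volume 0.5, and the estimator recovers it to within sampling error.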