Machine learning has made remarkable strides in recent years, with models achieving human-level performance on numerous tasks. The real challenge, however, lies not just in developing these models but in deploying them efficiently in real-world applications. This is where AI inference becomes crucial, making it a central focus for researchers and practitioners alike.