The efficiency of the Input Vector Control (IVC) method in sub-micron technologies is examined. IVC is a well-known method for reducing leakage current during standby mode in digital circuits. With IVC, a unique combination of input signal values, referred to as the Minimum Leakage Vector (MLV), can be found for every circuit to minimize its standby leakage currents. While the IVC method was thoroughly examined for older process nodes, its efficiency in sub-micron technologies has not been investigated. Increases in gate and sub-threshold leakage, as well as the Deep Sub-Micron (DSM) effects that accompany technology scaling, significantly affect the MLV of a digital gate. In this paper, the efficiency of IVC for basic CMOS gates is examined in 90nm, 65nm, and 40nm standard CMOS technologies, and the impact of gate sizing on the MLV is studied at different process corners. Simulation results show that while the IVC method remains efficient in advanced technologies, a gate's MLV is less stable with respect to gate size and process variations.
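The core idea behind IVC can be illustrated with a minimal sketch: enumerate every input vector of a gate, look up (or simulate) its standby leakage, and pick the vector with the smallest value as the MLV. The per-vector leakage numbers below are invented placeholders for a hypothetical 2-input NAND gate; in practice they would come from SPICE simulation of the target technology and process corner.

```python
from itertools import product

# Hypothetical standby leakage (in nA) per input vector of a 2-input
# CMOS NAND gate. These numbers are illustrative only; real values are
# obtained from circuit simulation for each technology and corner.
leakage_nA = {
    (0, 0): 12.4,
    (0, 1): 25.1,
    (1, 0): 21.7,
    (1, 1): 48.9,
}

def find_mlv(leakage):
    """Exhaustively search all input vectors and return the
    Minimum Leakage Vector (MLV) together with its leakage."""
    # Sanity check: the table must cover every vector of n inputs.
    n = len(next(iter(leakage)))
    assert set(leakage) == set(product((0, 1), repeat=n))
    return min(leakage.items(), key=lambda kv: kv[1])

mlv, leak = find_mlv(leakage_nA)
print(mlv, leak)  # -> (0, 0) 12.4
```

Exhaustive enumeration is feasible for single gates and small blocks; for full circuits, the paper's context implies heuristic or simulation-driven MLV search, since the input space grows exponentially with the number of primary inputs.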