Short answer: Not anymore. Overdrive generally doesn't add input lag anymore.
Long answer:
"yes" and "no".
Lag from overdrive is mostly gone from modern gaming monitors, but it did exist as recently as a few years ago. Older overdrive algorithms simply framebuffer the input and compute the overdrive before running the LCD scanout, which adds another +16.7ms of input lag at 60Hz (one full refresh cycle).
However, modern LCD displays can do realtime overdrive (based on a buffer of the PREVIOUS refresh cycle only) while scanning out the current refresh cycle in real time. There's typically only a row or two of pixel buffering, to give enough time for the math (and for "previous-scanline" realtime scaling algorithms) -- a monitor can do almost everything (e.g. picture processing, scaling processing, AND overdrive processing) in realtime using just "previous-scanline" buffering. But a single scan line -- one single row of pixels -- at the 135KHz horizontal scan rate of 1080p@120Hz is only 1/135,000th of a second (less than 0.01 millisecond of input lag). Single-scanline-buffer processing has pretty much unmeasurable input lag in the light of 1ms GtG. Even a 1 degree cooler temperature (LCDs respond slower in the cold) actually adds more input lag than a single scanline!
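To put those numbers side by side, here's a quick back-of-envelope sketch in Python. The 1125-total-scanline figure is an assumed typical 1080p vertical total (visible lines plus blanking); exact blanking varies by timing standard.

```python
# Back-of-envelope: latency of a single-scanline buffer at 1080p@120Hz,
# versus full-framebuffer overdrive at 60Hz.
refresh_hz = 120
total_scanlines = 1125                            # assumed: 1080 visible + blanking
hscan_rate_hz = refresh_hz * total_scanlines      # ~135,000 scanlines per second

scanline_buffer_lag_ms = 1000.0 / hscan_rate_hz   # ~0.0074 ms
full_frame_lag_ms_60hz = 1000.0 / 60              # ~16.7 ms

print(f"One-scanline buffer: {scanline_buffer_lag_ms:.4f} ms")
print(f"Full-frame buffer at 60Hz: {full_frame_lag_ms_60hz:.1f} ms")
```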
There's more than one way to do overdrive. The old way was to do it on a full framebuffer, which is sometimes simpler electronically and programming-wise if you have enough RAM, but it's mathematically unnecessary.
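Here's a rough toy sketch of the two approaches, just to make the difference concrete. Real monitors do this in fixed-function silicon or an FPGA with a factory-tuned lookup table; the `overdrive_lut` function below is a made-up linear stand-in, not any real panel's table.

```python
import numpy as np

# Toy overdrive lookup: pick a drive value from
# (previous refresh's pixel value, current target value).
# Real panels use a factory-tuned 2D LUT; this linear overshoot is a stand-in.
def overdrive_lut(prev, target, boost=0.5):
    driven = target + boost * (target.astype(int) - prev.astype(int))
    return np.clip(driven, 0, 255).astype(np.uint8)

def overdrive_full_frame(prev_frame, cur_frame):
    # Old approach: buffer the ENTIRE incoming refresh first, compute overdrive,
    # THEN start scanout -- adds a full refresh cycle of input lag.
    return overdrive_lut(prev_frame, cur_frame)

def overdrive_streaming(prev_frame, cur_frame):
    # Realtime approach: process each scanline as it arrives, using only the
    # stored copy of the previous refresh cycle, so lag is roughly one scanline.
    for y in range(cur_frame.shape[0]):
        yield y, overdrive_lut(prev_frame[y], cur_frame[y])

# Tiny usage example with made-up 4x8 "frames":
prev = np.zeros((4, 8), dtype=np.uint8)       # previous refresh: black
cur = np.full((4, 8), 128, dtype=np.uint8)    # current refresh: mid gray
for y, line in overdrive_streaming(prev, cur):
    pass  # each overdriven scanline could be sent to the panel immediately
```

The streaming version only ever needs the stored previous refresh cycle plus the scanline currently arriving, which is why the added lag is on the order of one scanline rather than one frame.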
Today, HDR is the new engineering challenge for input lag. HDR is easier to do on a framebuffer basis, but you can do it in realtime -- if you have the proper chips (e.g. FPGA) and minimized processing dependencies (e.g. avoiding compression algorithms that depend on future data). It takes a lot of time to engineer new monitor capabilities using precision clock-cycle-perfect realtime programming (DSPs, ASICs, FPGAs, etc), especially when we hit dotclocks of 600 million pixels per second (1080p@240Hz or 4K@60Hz) -- almost 2 billion subpixels per second when you split that into R/G/B, which all have to be processed separately. On top of high-quality real-time scaling algorithms and overdrive algorithms, that's a lot of computing power when floating-point HDR is thrown into the mix too... So some HDR displays have added input lag due to HDR, but not all of them. Early releases of a new monitor technology such as HDR often have more input lag than later releases, until the feature is refined into a more realtime implementation (one that doesn't require full frame buffering).
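For reference, here's the rough throughput math behind those figures. The total timings (visible plus blanking) are approximations; exact CVT/CTA timings vary per mode.

```python
# Approximate pixel/subpixel throughput for the modes mentioned above.
# Total timings (visible + blanking) are rough assumptions; exact values vary.
modes = {
    "1080p @ 240Hz": (2200, 1125, 240),
    "4K @ 60Hz":     (4400, 2250, 60),
}

for name, (h_total, v_total, hz) in modes.items():
    pixels_per_sec = h_total * v_total * hz
    subpixels_per_sec = pixels_per_sec * 3    # R, G, B processed separately
    print(f"{name}: ~{pixels_per_sec / 1e6:.0f} million pixels/s, "
          f"~{subpixels_per_sec / 1e9:.2f} billion subpixels/s")
```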
Input lag from turning on overdrive in a gaming monitor still happened a few years ago. One early 120Hz monitor was the Samsung SA series -- such as the S23A700D, for example -- whose overdrive added 1 frame of input lag. This was a catch-22: the "Samsung 3D mode" (which doubled as an amazingly good blur reduction strobe backlight) looked really bad unless you enabled an overdrive setting ("Response Time" = "Normal"), which then suddenly added 1 extra frame of input lag. Now you had strobing + at least 2 frames of input lag. You could feel the lag. Ouch!
Today, BENQ/Zowie monitors use essentially lagless overdrive, so you're not going to be able to measure input lag differences from turning overdrive on/off. If you hook up a microsecond-accurate photodiode to a lab oscilloscope, with a VSYNC wire (e.g. VGA/DVI), you might actually see a few tens or hundreds of microseconds LESS input lag with overdrive enabled -- because of earlier GtG completion thanks to overdrive.
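To see where that small improvement comes from, here's a toy first-order pixel-response model. The time constant, drive levels, and the 50% measurement threshold are all made-up illustration values, not measurements of any real panel; the point is just that overdriving past the target makes the pixel cross the threshold sooner.

```python
import math

# Toy first-order (exponential) model of an LCD pixel transition.
# All numbers are made-up illustration values, not real panel measurements.
tau_ms = 1.0              # assumed pixel response time constant
target = 100.0            # the level the pixel is supposed to reach
overdriven = 130.0        # overdrive temporarily commands a level past the target
threshold = 0.5 * target  # measurement threshold (e.g. photodiode trigger point)

def time_to_reach(level, drive, tau=tau_ms, start=0.0):
    # Solve start + (drive - start) * (1 - exp(-t / tau)) = level for t
    return -tau * math.log(1.0 - (level - start) / (drive - start))

t_plain = time_to_reach(threshold, drive=target)            # ~0.69 ms
t_overdriven = time_to_reach(threshold, drive=overdriven)   # ~0.49 ms
print(f"Crosses threshold {1000 * (t_plain - t_overdriven):.0f} microseconds "
      f"earlier with overdrive")
```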
Now, some panel technologies may be extremely tricky and require forward-AND-backward overdrive (lookahead algorithms), which still forces added input lag when overdrive is enabled. But technically, lookahead is unnecessary.