• 2 Posts
  • 128 Comments
Joined 5 months ago
Cake day: September 22nd, 2025

  • Discussion
    Our main finding is that using AI to complete tasks that require a new skill (i.e., knowledge of a new Python
    library) reduces skill formation.

    The erosion of conceptual understanding, code reading, and debugging skills that we measured among participants using AI assistance suggests that workers acquiring new skills should be mindful of their reliance on AI during the learning process.

    Among participants who use AI, we find a stark divide in skill-formation outcomes between high-scoring interaction patterns (65%-86% quiz score) and low-scoring interaction patterns (24%-39% quiz score). The high scorers either asked the AI only conceptual questions, rather than requesting code generation, or asked for explanations to accompany generated code; these usage patterns demonstrate a high level of cognitive engagement.
    Contrary to our initial hypothesis, we did not observe a significant performance boost in task completion
    in our main study.

    Our qualitative analysis reveals that our finding is largely due to the heterogeneity in how participants decide to use AI during the task.

    These contrasting patterns of AI usage suggest that accomplishing a task with new knowledge or skills does not necessarily lead to the same productivity gains as tasks that require only existing knowledge.
    Together, our results suggest that the aggressive incorporation of AI into the workplace can have negative impacts on the professional development of workers if they do not remain cognitively engaged. Given time constraints and organizational pressures, junior developers or other professionals may rely on AI to complete tasks as fast as possible at the cost of real skill development.

    Furthermore, we found that the biggest difference in test scores was on the debugging questions. This suggests that as companies shift toward AI writing more code under human supervision, humans may not possess the skills needed to validate and debug AI-written code if their own skill formation was inhibited by using AI in the first place.










  • Well, it’s a couple of things.

    First off, a wireless transmission speed of 120 Gbps sounds really impressive, but remember from the Shannon-Hartley theorem that the maximum channel capacity is just a function of bandwidth and SNR: C = B·log₂(1 + SNR). This means you can get an arbitrarily high transmission speed by increasing the bandwidth to an obscene amount and/or by increasing the SNR (i.e., by transmitting at an obscenely high power).
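As a quick sanity check on that formula, here is a minimal sketch (the function name and the 20 dB example SNR are my own choices, not from the paper) showing that at a fixed SNR, doubling the bandwidth doubles the capacity:

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley channel capacity: C = B * log2(1 + SNR), in bit/s."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# At a fixed SNR of 20 dB (linear 100), capacity scales linearly with bandwidth:
c_20mhz = shannon_capacity_bps(20e6, 100)  # ~133 Mbps
c_40mhz = shannon_capacity_bps(40e6, 100)  # exactly double, ~266 Mbps
```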

    In the paper they say that the transmit power was 15 dBm, which is a normal transmit power for WiFi, so it’s the 40 GHz of bandwidth that’s doing the heavy lifting in allowing that data rate.
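You can see this by inverting Shannon-Hartley to ask what SNR the link actually needs (a back-of-the-envelope sketch; the helper name is mine):

```python
import math

def required_snr_db(target_bps: float, bandwidth_hz: float) -> float:
    """Minimum SNR (in dB) needed to hit target_bps in bandwidth_hz."""
    snr_linear = 2 ** (target_bps / bandwidth_hz) - 1
    return 10 * math.log10(snr_linear)

# 120 Gbps in 40 GHz is a spectral efficiency of only 3 bit/s/Hz,
# i.e. SNR = 2**3 - 1 = 7, or about 8.45 dB -- modest, and easily
# reachable at 15 dBm over a short link.
snr_db = required_snr_db(120e9, 40e9)
```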

    The second thing is channelization: WiFi 6E (for example) uses 1.2 GHz of spectrum in the 6 GHz band, divided into seven non-overlapping 160 MHz channels; WiFi 5 uses about nine 80 MHz channels in the 5 GHz band, and so on. So if you want to use the technology demonstrated in the paper for WiFi (as the headline of the article suggests), you’d need a bunch of non-overlapping 40 GHz channels, which puts you up in the ~200-300 GHz range: the very top of the microwave spectrum, bordering on far infra-red!
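To put numbers on how little room 40 GHz channels leave, a toy calculation (the 100 GHz window is my assumption, matching the ~200-300 GHz range discussed above):

```python
def channel_count(band_hz: float, channel_hz: float) -> int:
    """Number of non-overlapping channels of a given width that fit in a band."""
    return int(band_hz // channel_hz)

# WiFi 6E: 1.2 GHz of spectrum in the 6 GHz band -> seven 160 MHz channels
wifi6e_channels = channel_count(1.2e9, 160e6)
# A hypothetical 100 GHz window around 200-300 GHz -> only two 40 GHz channels
sub_thz_channels = channel_count(100e9, 40e9)
```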

    If you want to imagine how useful that would be, just think about how useful your infra-red TV remote is. You would only be able to do line-of-sight point-to-point links at that frequency.

    IR point-to-point links already exist, and the silicon they invented for this paper is impressive, but the hype around it being a possible future WiFi standard doesn’t really hold up to basic inspection.