• Empricorn@feddit.nl · 1 year ago

      Naw, USB-A is much more secure. I plug that end into my power bank, throw it in a bag or my pocket, and it disconnects maybe 1 time for every 100 times the USB-C or Lightning end does. It is a little larger, though.

    • TWeaK@lemm.ee · 1 year ago

      I just wish they didn’t come with chips inside our cables.

      • Echo Dot@feddit.uk · 1 year ago

        You need that for power regulation. One of the reasons you can use a USB-C lead with anything is that devices with different power requirements just tell the cable what they need, and the chip inside the cable deals with it. Otherwise there would have to be different cables for different voltage requirements.
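A rough sketch of the negotiation being described (illustrative only, not the real USB-PD wire protocol; the function name is made up, though the voltage tiers match common PD profiles):

```python
# Illustrative sketch, not the actual USB-PD protocol: a source
# advertises the power profiles it supports, the sink picks one, and
# anything that can't be matched falls back to plain 5 V USB.
SOURCE_PROFILES = [(5.0, 3.0), (9.0, 3.0), (15.0, 3.0), (20.0, 5.0)]  # (volts, amps)

def pick_profile(profiles, wanted_volts):
    """Return the advertised profile matching the sink's request,
    or the plain-USB default if nothing matches."""
    for volts, amps in profiles:
        if volts == wanted_volts:
            return volts, amps
    return (5.0, 0.5)  # basic USB fallback

assert pick_profile(SOURCE_PROFILES, 20.0) == (20.0, 5.0)
```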

        • TWeaK@lemm.ee · 1 year ago

          You don’t need it, though. Power regulation is a negotiation between the load and the supply; the cable is an unnecessary third party. The cable should just be a multicore connection between two things, not a third device.

          If I had to go out on a limb, though, I’d say it’s because manufacturers were selling cheap cables that didn’t meet the specification, and people were using them with higher-power devices, causing overheating. By including a chip in the spec for the cable, you push some of the responsibility back onto the cable manufacturer, and they can limit the maximum current to whatever they’ve designed for. In that case, we already do have different cables for different power levels: if your cable isn’t rated for 100W, it might force a lower power even if your device and charger can both do 100W. However, it would be better if cable manufacturers just met the basic design specification to begin with, rather than creating unnecessary overhead.
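The current-capping behaviour described here can be pictured as a simple minimum over the chain (a sketch; the function is made up, but the 3 A / 5 A values are the real USB-C tiers, with 5 A requiring an e-marked cable):

```python
# Sketch: the negotiated current is capped by every party in the link,
# including the cable's own rating. The lowest rating in the chain wins.
def negotiated_current(source_max_a, sink_max_a, cable_rated_a):
    return min(source_max_a, sink_max_a, cable_rated_a)

# 100 W charger and laptop (5 A at 20 V), but a cable only rated for 3 A:
current = negotiated_current(5.0, 5.0, 3.0)
watts = current * 20.0  # 3 A at 20 V: 60 W instead of 100 W
```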

          • Echo Dot@feddit.uk · 1 year ago

            It doesn’t make any difference whether the negotiation is between the supply and the device or between the cable and the device; either way it’s still two devices talking.

            By pushing the responsibility onto the cable, you can operate the cable directly from a USB port. So you can have things like electrical sockets with USB connections and you don’t have to have chips in the sockets, because typically they’re just dumb electrical interfaces. It also means that the device delivering the power doesn’t have to be actually fully switched on, so you can recharge your phone from a USB port on your computer and you don’t have to power the computer on. As long as there is an open electrical channel to the port, the cable will deal with it all itself.

            Also, it’s more efficient: otherwise you would need a control circuit in every single power-delivery device, but this way it lives in just the one cable, so it’s one chip for an unlimited number of power-delivery devices.

            • TWeaK@lemm.ee · 1 year ago

              So you can have things like electrical sockets with USB connections and you don’t have to have chips in the sockets, because typically they’re just dumb electrical interfaces.

              If the supply is dumb and cannot negotiate power, then there is nothing to negotiate and the link falls back to regular 5 V USB. The same applies if the load is dumb. In either case, there is no need for a cable chip.

              It also means that the device delivering the power doesn’t have to be actually fully switched on, so you can recharge your phone from a USB port on your computer and you don’t have to power the computer on.

              If the USB port has power to it, the computer is supplying it. The voltage would be on but open circuit. The computer would not have to power the negotiation circuitry until a cable has been connected end to end and the circuit is closed.

              You’re trying to present this as the cable replacing one of the devices, but it doesn’t: it’s an extra third device in the negotiation. All three devices must permit a certain charging level for that level to be used. It may have some benefit in ensuring that a cable’s load capacity isn’t exceeded, but like I say, it would be far better if cables were reliably manufactured to handle the specified loads in the first place.
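The fallback behaviour being argued about can be sketched like this (names and numbers are illustrative, not a real API; 5 V / 500 mA is the classic plain-USB default):

```python
# Sketch: if either end of the link is "dumb" (can't negotiate), there
# is nothing to negotiate and the connection runs at the plain USB
# default, with or without a chip in the cable. Illustrative only.
USB_DEFAULT = (5.0, 0.5)  # plain USB: 5 V, 500 mA

def link_power(source_can_negotiate, sink_can_negotiate, negotiated=(20.0, 5.0)):
    """Return (volts, amps) for the link."""
    if source_can_negotiate and sink_can_negotiate:
        return negotiated  # both ends can talk, use the agreed contract
    return USB_DEFAULT     # a dumb end forces the plain-USB fallback

assert link_power(True, False) == USB_DEFAULT
```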

          • anotherandrew@lemmy.mixdown.ca · 1 year ago

            The cable has to carry the negotiated power safely. It’s not unnecessary, it’s absolutely critical. I’ve personally seen and diagnosed the results when this fails.

            For low-power applications there is no need for it, and the spec allows for that.

            • TWeaK@lemm.ee · 1 year ago

              It wouldn’t be critical if the cables were suitably rated for the specification. If you put a 0.5A cable in a 3A circuit, you’re gonna have a bad time. If you use a 3A or better cable, then you don’t need a cable chip to tell the actual devices to only work at 0.5A.

              • anotherandrew@lemmy.mixdown.ca · 1 year ago

                How do you have the cable correctly identify itself if you don’t put some smarts in it? Or are you saying we should only be able to buy expensive cables fully rated for 100 W (or higher, as the spec has been updated)? And how do you prevent an older cable rated for 100 W from being abused in a newer 200 W circuit?

                Divider resistors are okay, but the IC is a better choice for future-proofing and reliability.
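For reference, the divider-resistor scheme mentioned here is how basic (non-PD) USB-C advertises current capability: the source puts a pull-up (Rp) on the CC line and the sink reads the divided voltage across its pull-down (Rd). A sketch using the commonly cited resistor tiers for a 5 V pull-up (rounded voltages are approximate):

```python
# Sketch of the CC-line divider in basic USB-C (no chip involved).
# Rp tiers advertise the source's current capability; Rd is the sink's
# standard 5.1 kOhm pull-down.
RP_OHMS = {"default USB": 56_000, "1.5 A": 22_000, "3.0 A": 10_000}
RD_OHMS = 5_100

def cc_voltage(rp_ohms, vpullup=5.0):
    # Plain voltage divider: Vcc = Vpullup * Rd / (Rp + Rd).
    return vpullup * RD_OHMS / (rp_ohms + RD_OHMS)

for label, rp in RP_OHMS.items():
    print(f"{label}: {cc_voltage(rp):.2f} V")  # roughly 0.42 / 0.94 / 1.69 V
```

In full PD, the e-marker chip replaces this purely resistive identification, which is the future-proofing point being made above.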

      • Bobby Turkalino@lemmy.yachts · 1 year ago

        A chip can literally just contain basic logic gates. Your aversion to them is based on pure QAnon fiction.

        • TWeaK@lemm.ee · 1 year ago

          My aversion to them is an aversion to unnecessary overhead. A cable is a cable, it shouldn’t be a third device.

        • anotherandrew@lemmy.mixdown.ca · 1 year ago

          No, the chip is a microcontroller with firmware. You could try to do it in pure logic, but it’s a waste of effort and development resources.