Having a fast and responsive app is orthogonal to “knowing your big Os”. Unfortunately, most tech companies over-emphasize algorithms in interviews and downplay systems knowledge, and I believe that’s one reason behind sluggish apps and bloated systems.

I’ve seen this play out repeatedly. Interviewers ask a LeetCode-style coding question, which is then followed by the ritual of discussing time and memory complexity. Candidates ace the answers. But then… their “real” code suffers from subtle yet impactful performance problems.

  • @uniqueid198x@lemmy.dbzer0.com · 10 months ago

    The vast majority of wall time for most uses is I/O. You need someone on your team to care about big O, but for most teams, it's not the problem.

  • blazera · 10 months ago

    Besides an amazing anime, the heck is a big O?

    • @wizardbeard@lemmy.dbzer0.com · 10 months ago (edited)

      It's a generalized method/notation for describing how long a piece of code takes to run as its input grows.

      O(n²) means that as the input size n grows, the running time grows quadratically. Like if you were trying to find matching items in an array by looping over every item, and then using a nested loop inside the first one to compare against every other item in the array. You'd be doing (# of items) × (# of items) comparisons, or (# of items)² comparisons. n² comparisons.

      There are some rules about simplifying the result down to its most significant term, so O(n + n) would be simplified to O(n), but that's the basics.

      It’s a pretty important concept as you get into making larger projects.
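A minimal sketch of the nested-loop comparison described above (the function name and sample data are mine, just for illustration):

```python
def find_matches_quadratic(items):
    """Find duplicated values by comparing every item against every
    other item: n * n comparisons, so O(n^2) overall."""
    matches = []
    for i, a in enumerate(items):
        for j, b in enumerate(items):  # nested loop: the whole array again
            if i != j and a == b:
                matches.append(a)
    return matches

# Doubling the input roughly quadruples the number of comparisons.
print(find_matches_quadratic([1, 2, 2, 3]))
```

The same job can be done in O(n) with a set of seen values, which is exactly the kind of improvement big-O analysis points you toward.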

    • sj_zero · 10 months ago

      Its when you find the clitoris for the first time.

      Joking aside, it's a description of the runtime of a thing for a given data set size, expressed as a function. So for example an exponential runtime would blow up as your data set grows, linear time has a basically proportional running time compared to the size of the data set, and log(n) would see runtime increase very little as data set size increases.
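To make the linear vs. log(n) contrast concrete, here's a small step-counting sketch (the helper names are mine): a worst-case linear scan touches every element, while a binary search halves the remaining range each step.

```python
def linear_scan_steps(n):
    # Worst case for an unsorted scan: check every element -> O(n)
    return n

def binary_search_steps(n):
    # Worst case for binary search on a sorted list of size n:
    # halve the remaining range until it's empty -> O(log n)
    steps, lo, hi = 0, 0, n - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        lo = mid + 1  # always keep the larger half, simulating the worst case
    return steps

for n in (1_000, 1_000_000):
    print(n, linear_scan_steps(n), binary_search_steps(n))
```

Going from a thousand elements to a million multiplies the linear cost by 1000 but adds only about ten steps to the logarithmic one.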

  • @atheken@programming.dev · 10 months ago

    Sure, you can make it 10x faster by optimizing the complexity, but I can improve throughput by 1000x by putting it on 1000 machines.

    In practice, it’s a bit of both. Paying attention to Big-O and understanding the gross relative cost of various operations goes a long way, but being able to run an arbitrarily large number of them can get you out of a lot of jams.

    • @huginn@feddit.it · 10 months ago

      I can improve throughput by 1000x by putting it on 1000 machines

      My app is installed on 10000 phones! See how much better it runs!

    • snooggums · 10 months ago

      And sometimes a completely different approach is better than focusing on the runtime of a single piece. We had a system of SQL code that ran many functions that scaled horribly as data was added to the system. Each function was as fast as could be measured, but the process of running them as separate repetitive functions ended up taking over 24 hours in some cases.

      Redoing them as a single stored procedure and optimizing that reduced it to a few minutes. This approach was much more complex than the individual functions and took about a year to replace what was originally written in 3 months.
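The shape of that rewrite can be sketched in miniature (all names and data here are invented; the real system was SQL, this just shows the per-row vs. single-pass pattern):

```python
# A toy "table" of 1000 rows.
rows = [{"id": i, "value": i % 10} for i in range(1000)]

def lookup_value(table, row_id):
    # Each call scans the whole table again: O(n) per call.
    for row in table:
        if row["id"] == row_id:
            return row["value"]

def per_row_total(table):
    # Like calling a separate function per row: n lookups of O(n) each,
    # so O(n^2) overall. This is the pattern that blew up to 24+ hours.
    return sum(lookup_value(table, row["id"]) for row in table)

def set_based_total(table):
    # Like one optimized stored procedure: a single pass, O(n).
    return sum(row["value"] for row in table)

assert per_row_total(rows) == set_based_total(rows)
```

The two versions return the same answer; only the second one keeps returning it quickly as the table grows.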

      Many times the effort to make something more efficient takes far longer than getting it to work in the first place, and many processes are slow because they did not allocate time to properly optimize or include staff that are able to optimize.