Once, I was asked this question in an interview: “Suppose there is a very large sorted singly linked list, and you have to search for an element in it efficiently. Which algorithm will you use, and what will be its time complexity?”

Can someone please answer this?

Yes @Archit, since the list is sorted you need a smarter strategy than linear search, namely binary search. For a fairly large sorted list, a plain linear search is O(n) in complexity terms, i.e. the time taken to search grows at the same rate as the list does. A binary search instead starts at the middle of the sorted list and checks whether that element is greater or less than the value you’re looking for, which tells you whether the value lies in the first or the second half. You then jump to the middle of that sublist, compare again, and so on. This is pretty much how humans look up a word in a dictionary (although we use better heuristics, obviously: if you’re looking for “cat” you don’t start at “M”). In complexity terms this takes about `O(log n)` comparisons, i.e. the number of search operations grows much more slowly than the list does, because you’re halving the “search space” with each operation.
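The halving strategy described above can be sketched in Python. Note this sketch assumes an indexable sorted sequence (an array/list), since that is the setting where each midpoint access is O(1); the function name and variable names are illustrative, not from the thread:

```python
def binary_search(arr, target):
    """Return the index of target in sorted arr, or -1 if absent."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2      # middle of the current search space
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1          # target must be in the upper half
        else:
            hi = mid - 1          # target must be in the lower half
    return -1
```

Each iteration discards half of the remaining candidates, which is where the O(log n) bound comes from.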

To get to the middle of a linked list, wouldn’t you have to do n/2 operations?

Though I am late to reply: the interviewer gave away the answer the moment he said “sorted”. Binary search is the way to go for any sorted linear structure. That said, the point above about reaching the middle is valid: a singly linked list has no random access, so each midpoint must be reached by walking pointers. You end up with O(log n) comparisons but O(n) pointer steps in total (n/2 + n/4 + … to reach successive midpoints), which is worth mentioning to the interviewer.
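To make that trade-off concrete, here is a sketch of binary search adapted to a sorted singly linked list (the `Node` class and helper names are my own illustration, not from the thread). Midpoints are found by walking `span // 2` nodes forward from the current left boundary:

```python
class Node:
    def __init__(self, value, nxt=None):
        self.value = value
        self.next = nxt

def build(values):
    """Build a sorted singly linked list from an iterable; return the head."""
    head = None
    for v in reversed(list(values)):
        head = Node(v, head)
    return head

def length(head):
    n = 0
    while head:
        n += 1
        head = head.next
    return n

def search(head, target):
    """Binary search on a sorted singly linked list.

    Total pointer steps are O(n) (n/2 + n/4 + ... to reach midpoints),
    but only O(log n) comparisons against target are made.
    """
    lo, span = head, length(head)
    while span > 0:
        mid, steps = lo, span // 2
        for _ in range(steps):        # walk to the middle of the sublist
            mid = mid.next
        if mid.value == target:
            return True
        elif mid.value < target:
            lo = mid.next             # continue in the upper half
            span = span - steps - 1
        else:
            span = steps              # continue in the lower half
    return False
```

So the honest interview answer is: binary search still cuts comparisons to O(log n), which matters when comparisons are expensive, but the traversal cost keeps the overall time linear in the list length.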