This talk provided a good overview of the ongoing effort to prove conditional lower bounds for various problems in P. Amir presented four (non-disjoint) types of hardness results (or rather classes of problems), grouped according to the assumption/conjecture used. These include SETH-hardness (i.e., assuming that $n$-variable SAT instances cannot be solved in time $c^n$ for any $c<2$), 3SUM-hardness (i.e., assuming that 3SUM has no sub-quadratic (i.e., $n^c$ for $c<2$) time algorithm), APSP-hardness (i.e., assuming that all-pairs shortest paths cannot be solved in sub-cubic time), and the $k$-clique conjecture (which refers to beating the best known algorithms).
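To make the quadratic baseline concrete, the following is a minimal sketch (not from the talk; the function name and test values are illustrative) of the standard $O(n^2)$-time 3SUM algorithm via sorting and a two-pointer scan. The 3SUM conjecture asserts that this bound cannot be improved to $n^c$ for any $c<2$.

```python
# Standard O(n^2)-time algorithm for 3SUM: given a list of integers,
# decide whether three of them (at distinct indices) sum to zero.
# The 3SUM conjecture asserts this quadratic bound cannot be beaten
# by a polynomial factor.
def three_sum(nums):
    a = sorted(nums)                      # O(n log n) preprocessing
    n = len(a)
    for i in range(n - 2):                # fix the smallest element
        lo, hi = i + 1, n - 1
        while lo < hi:                    # two-pointer scan: O(n) per i
            s = a[i] + a[lo] + a[hi]
            if s == 0:
                return True
            if s < 0:
                lo += 1                   # total too small: raise it
            else:
                hi -= 1                   # total too large: lower it
    return False

assert three_sum([5, -7, 2, 11, -4])      # 5 + (-7) + 2 == 0
assert not three_sum([1, 2, 3])
```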
The class P attempts to capture the efficiently solvable computational tasks. It is full of practically relevant problems, with varied and fascinating combinatorial structure. In this talk, I will give an overview of a rapidly growing body of work that seeks a better understanding of the structure within P. Inspired by NP-hardness, the main tool in this approach is combinatorial reductions. Combining these reductions with a small set of plausible conjectures, we obtain tight lower bounds on the time complexity of many of the most important problems in P. I will present the most recent landscape of P and the conjectures on which this project is based (e.g., the Strong Exponential Time Hypothesis). I will discuss recent attempts at identifying new conjectures: either more reliable ones, or ones that will get us closer to a full classification of the important problems in P. Finally, I will highlight a surprising new reduction from Circuit-SAT to natural problems in P such as Edit-Distance, which proves that even minor improvements over the quadratic running time of Edit-Distance would suffice to prove major complexity separations.
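For context on the quadratic running time mentioned above, here is a minimal sketch (not from the abstract) of the textbook $O(nm)$ dynamic program for Edit-Distance; the reduction discussed shows that even an $n^{2-\epsilon}$-time algorithm for this problem would imply major complexity separations.

```python
# Textbook O(nm) dynamic program for Edit-Distance (Levenshtein distance):
# the minimum number of insertions, deletions, and substitutions turning
# string s into string t. This is the quadratic running time that the
# Circuit-SAT reduction shows is hard to improve, even slightly.
def edit_distance(s, t):
    n, m = len(s), len(t)
    # prev[j] holds the distance between s[:i-1] and t[:j]
    prev = list(range(m + 1))             # distances from the empty prefix
    for i in range(1, n + 1):
        curr = [i] + [0] * m
        for j in range(1, m + 1):
            cost = 0 if s[i - 1] == t[j - 1] else 1
            curr[j] = min(prev[j] + 1,         # delete s[i-1]
                          curr[j - 1] + 1,     # insert t[j-1]
                          prev[j - 1] + cost)  # substitute (or match)
        prev = curr
    return prev[m]

assert edit_distance("kitten", "sitting") == 3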