September 18

4:00–5:00 pm (America/New_York)
Sparse Graph Counting and Kelley–Meka Bounds for Binary Systems

Abstract: The graph counting lemma of Chung, Graham, and Wilson (Combinatorica 1988) is a fundamental result in combinatorics: if a large graph $G$ is pseudorandom, then the number of copies of any small graph $H$ in $G$ is close to what is expected in a random graph of the same density. However, this result is only nontrivial when $G$ is dense. In this work, we obtain a counting lemma that works in the sparse setting and is well suited to the density increment arguments of additive combinatorics. In a remarkable recent breakthrough, Kelley and Meka (FOCS 2023) obtained a strong upper bound on the density of sets of integers without nontrivial three-term arithmetic progressions. We combine our counting lemma with other ideas to establish Kelley–Meka type bounds for all linear patterns defined by translation-invariant systems of binary linear forms, i.e., systems in which each form depends on exactly two variables. In particular, we obtain strong bounds for the Turán problem in Abelian Cayley sum graphs, i.e., an upper bound on the maximum edge density of an Abelian Cayley sum graph containing a clique of a given size. To prove our results, we employ some of the recent technology developed by Kelley and Meka as well as the follow-up work of Kelley, Lovett, and Meka (STOC 2024). This talk is based on joint work with Yuval Filmus, Hamed Hatami, and Kaave Hosseini.

Location: 32-G575
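As a point of reference for the pattern in question, the sketch below brute-force counts three-term arithmetic progressions in a subset of $\mathbb{Z}_N$, the prototypical translation-invariant pattern covered by such bounds. This is an illustration only; the function names are hypothetical and nothing here reflects the techniques of the talk.

```python
import random

def count_three_aps(A, N):
    """Count nontrivial ordered 3-APs x, x+d, x+2d (d != 0) inside A, modulo N."""
    A = set(A)
    return sum(1 for x in A for d in range(1, N)
               if (x + d) % N in A and (x + 2 * d) % N in A)

# A random subset of Z_N with density delta contains about delta^3 * N * (N - 1)
# nontrivial 3-APs; Kelley--Meka-type bounds quantify how sparse a set must be
# to contain none at all.
N, delta = 101, 0.25
A = {x for x in range(N) if random.random() < delta}
print(len(A), count_three_aps(A, N))
```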

October 02

4:00–5:00 pm (America/New_York)
Capacity Threshold for the Ising Perceptron

Abstract: We show that the capacity of the Ising perceptron is, with high probability, upper bounded by the constant $\alpha \approx 0.833$ conjectured by Krauth and Mézard, under the condition that an explicit two-variable function $S(\lambda_1,\lambda_2)$ is maximized at $(1,0)$. The earlier work of Ding and Sun proves the matching lower bound subject to a similar numerical condition, and together these results give a conditional proof of the conjecture of Krauth and Mézard.

Location: G575
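For concreteness, the Ising perceptron asks whether a system of random half-space constraints admits a $\pm 1$ solution, and the capacity is the largest constraint density $\alpha = m/n$ at which solutions typically exist. The finite-size sketch below is an illustration of the model's definition only (helper names are hypothetical, and small-$n$ experiments do not resolve the threshold).

```python
import itertools
import numpy as np

def has_ising_solution(G):
    """Exhaustively check whether some sigma in {-1,+1}^n satisfies
    G @ sigma >= 0 componentwise (the m perceptron constraints)."""
    n = G.shape[1]
    for signs in itertools.product((-1.0, 1.0), repeat=n):
        if np.all(G @ np.array(signs) >= 0):
            return True
    return False

# Finite-size illustration around the conjectured capacity alpha ~ 0.833.
rng = np.random.default_rng(0)
n, trials = 12, 20
for alpha in (0.5, 0.833, 1.2):
    m = max(1, int(alpha * n))
    hits = sum(has_ising_solution(rng.standard_normal((m, n))) for _ in range(trials))
    print(f"alpha = {alpha}: fraction of solvable instances ~ {hits / trials:.2f}")
```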

October 08

12:00–12:45 pm (America/New_York)
Tackling climate change as a research software engineer

Abstract: In this talk I would like to give a high-level overview of the work we carry out at ICCS: how we work with our scientific collaborators to solve research problems, and the in-house software we have developed to facilitate the addition of ML into legacy weather and climate models. Specifically, we have several groups working with Fortran codes who are now looking to incorporate PyTorch ML models into their existing programs. I also have a keen interest in tooling, and I would like to introduce an MPI-aware debugger called mdb, which I am currently developing.

Bio: Tom Meltzer is a senior Research Software Engineer at the Institute of Computing for Climate Science (ICCS), based at the University of Cambridge. He specializes in high-performance computing, working with scientists to optimize and improve their software to drive better research outcomes. He is currently working on a next-generation sea-ice model (NextSimDG), writing a parallel backend using MPI. Before transitioning to research software engineering, he was an atomic and molecular physicist. He obtained his PhD from University College London.

Location: Seminar Room D507
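One common route for coupling a PyTorch model into a compiled-language host code, assumed here purely for illustration and not necessarily the ICCS in-house approach, is to serialize the model with TorchScript so that a libtorch-based wrapper callable from Fortran can load it. A minimal sketch, with `TinyParameterisation` as a hypothetical stand-in model:

```python
import torch

class TinyParameterisation(torch.nn.Module):
    """Hypothetical stand-in for an ML physics parameterisation."""
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(8, 32), torch.nn.ReLU(), torch.nn.Linear(32, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = TinyParameterisation().eval()
example_input = torch.randn(1, 8)                 # shape the host model would pass in
scripted = torch.jit.trace(model, example_input)  # freeze the graph for libtorch
scripted.save("tiny_parameterisation.pt")         # artifact loaded from the compiled side
```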

October 09

4:00–5:00 pm (America/New_York)
Proofs of Space with Maximal Hardness

Abstract: In a proof of space, a prover performs a complex computation with a large output, and a verifier periodically checks that the prover still holds the output. The security goal for a proof-of-space construction is to ensure that a prover who erases even a portion of the output has to redo a large portion of the complex computation in order to satisfy the verifier.

In existing constructions of proofs of space, the computation that a cheating prover is forced to redo is a small fraction (vanishing or a small constant) of the original complex computation. The only exception is a construction of Pietrzak (ITCS 2019) that requires extremely depth-robust graphs, which result in impractically high complexity of the initialization process.

We present the first proof of space of reasonable complexity that ensures the prover has to redo almost the entire computation (a fraction arbitrarily close to 1) when trying to save even an arbitrarily small constant fraction of the space. Our construction is a generalization of an existing construction called SDR (Fisch, Eurocrypt 2019) deployed on the Filecoin blockchain. Our improvements, while general, also demonstrate that the already deployed construction has considerably better security than previously shown.

Technically, our construction can be viewed as amplifying predecessor-robust graphs. These are directed acyclic graphs in which every set of $\pi \cdot n$ nodes contains a subset of $\alpha_\pi \cdot n$ nodes whose induced subgraph has just one sink. We take a predecessor-robust graph with constant parameters $(\pi, \alpha_\pi)$, and build a bigger predecessor-robust graph with a near-optimal set of parameters and additional guarantees on sink placement, while increasing the degree only by a small additive constant.

Location: G575
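For intuition about the definition, the following sketch brute-force checks predecessor robustness on a tiny DAG. It is illustrative only; the helper names are hypothetical and this is not the construction from the talk.

```python
import math
from itertools import combinations

def sinks(edges, T):
    """Nodes of T with no outgoing edge into another node of T."""
    Tset = set(T)
    return [v for v in T if not any((v, w) in edges for w in Tset if w != v)]

def is_predecessor_robust(n, edges, pi, alpha):
    """Brute-force check (tiny n only): every set S of ceil(pi*n) nodes contains
    a subset T with |T| >= ceil(alpha*n) whose induced subgraph has exactly one sink."""
    s_size, t_size = math.ceil(pi * n), math.ceil(alpha * n)
    for S in combinations(range(n), s_size):
        if not any(len(sinks(edges, T)) == 1
                   for k in range(t_size, s_size + 1)
                   for T in combinations(S, k)):
            return False
    return True

# Toy example: in the transitive DAG with an edge (i, j) for every i < j, every
# induced subgraph has exactly one sink (its largest node), so the check passes.
n = 8
edges = {(i, j) for i in range(n) for j in range(i + 1, n)}
print(is_predecessor_robust(n, edges, pi=0.5, alpha=0.25))
```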