#1. Web Crawler Multithreaded
Implement a multithreaded web crawler to efficiently explore and retrieve pages from a website, staying within the start URL's domain and never fetching the same page twice.
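A minimal sketch of the multithreaded approach: a level-by-level BFS where each frontier is fetched in parallel by a thread pool, and deduplication plus the same-hostname check happen between levels. The `SITE` dictionary and `get_urls` function are hypothetical stand-ins for real HTTP fetches.

```python
from concurrent.futures import ThreadPoolExecutor
from urllib.parse import urlsplit

# Hypothetical in-memory site graph standing in for real HTTP fetches.
SITE = {
    "http://example.com/":  ["http://example.com/a", "http://other.com/"],
    "http://example.com/a": ["http://example.com/b"],
    "http://example.com/b": ["http://example.com/"],
    "http://other.com/":    [],
}

def get_urls(url):
    """Simulated fetch: return the outgoing links of a page."""
    return SITE.get(url, [])

def crawl(start_url, get_urls, max_workers=4):
    """Multithreaded BFS restricted to start_url's hostname."""
    host = urlsplit(start_url).hostname
    seen = {start_url}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        frontier = [start_url]
        while frontier:
            # Fetch every page in the current frontier in parallel.
            results = pool.map(get_urls, frontier)
            next_frontier = []
            # Dedup runs on the main thread, so no lock is needed here.
            for links in results:
                for link in links:
                    if urlsplit(link).hostname != host:
                        continue  # stay within the same domain
                    if link not in seen:
                        seen.add(link)
                        next_frontier.append(link)
            frontier = next_frontier
    return seen
```

Batching each BFS level through `pool.map` keeps the bookkeeping single-threaded while still overlapping the slow part (the fetches); a fully asynchronous version would instead share a visited set guarded by a lock.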
#2. Find Duplicate File in System
You are given file paths and their content. Find and return groups of duplicate files based on their content.
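A sketch of one common approach: hash each file's content into a dictionary keyed by content, then keep only the groups with more than one path. The input format assumed here — `"dir f1.txt(content1) f2.txt(content2)"` — is an assumption about how paths and contents are packed together.

```python
from collections import defaultdict

def find_duplicates(paths):
    """Group full file paths by identical content.

    Assumed input format: each entry is a directory followed by
    space-separated "name(content)" tokens.
    """
    groups = defaultdict(list)
    for entry in paths:
        directory, *files = entry.split()
        for f in files:
            name, _, content = f.partition("(")
            groups[content.rstrip(")")].append(f"{directory}/{name}")
    # Only contents shared by more than one file count as duplicates.
    return [group for group in groups.values() if len(group) > 1]
```

With real files, hashing a short prefix first and only hashing full contents on prefix collisions avoids reading every byte of every file.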
#3. Web Crawler
Simulate a web crawler that visits a website and collects webpages following specific rules: avoid revisiting pages and stay within the same domain.
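The single-threaded variant reduces to a graph traversal with two filters: a visited set and a hostname comparison. A minimal sketch, where `get_urls` is a hypothetical fetch function injected by the caller:

```python
from urllib.parse import urlsplit

def crawl(start_url, get_urls):
    """Iterative DFS that visits each same-host page exactly once."""
    host = urlsplit(start_url).hostname
    seen = {start_url}
    stack = [start_url]
    while stack:
        url = stack.pop()
        for link in get_urls(url):
            # Skip other domains and already-visited pages.
            if urlsplit(link).hostname == host and link not in seen:
                seen.add(link)
                stack.append(link)
    return seen
```

BFS with a queue works equally well; only the visit order differs, not the set of pages collected.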
#4. Exclusive Time of Functions
Calculate the exclusive execution time for each function given a series of log entries marking function calls and returns. The exclusive time is the time spent within a function, not including time spent in its callees.
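The standard approach is a stack of active call frames: each log event closes out the interval since the previous event and charges it to whichever function was running. A sketch, assuming logs of the form `"id:start:ts"` / `"id:end:ts"` with inclusive end timestamps:

```python
def exclusive_time(n, logs):
    """Exclusive time per function id from start/end log lines."""
    result = [0] * n
    stack = []   # ids of functions currently on the call stack
    prev = 0     # timestamp where the previous interval ended
    for log in logs:
        fid, event, ts = log.split(":")
        fid, ts = int(fid), int(ts)
        if event == "start":
            if stack:
                # The caller ran from prev up to this start.
                result[stack[-1]] += ts - prev
            stack.append(fid)
            prev = ts
        else:
            # End timestamps are inclusive, hence the +1.
            result[stack.pop()] += ts - prev + 1
            prev = ts + 1
    return result
```

For example, with `n = 2` and logs `["0:start:0", "1:start:2", "1:end:5", "0:end:6"]`, function 0 runs during units 0-1 and 6 while function 1 runs during units 2-5, giving exclusive times of 3 and 4.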