Name: Carlos Suberviola
NetID: cls383
Course: ECE 5755

Lab 1

1. Profiling:

a. convolution():
[Figure: top-down breakdown (Frontend_Bound, Bad_Speculation, Backend_Bound, Retiring), stacked bars 0–100%, for Convolution at in_size 18, 9, and 2048]

b. relu():
[Figure: top-down breakdown for ReLU at in_size 3, 15, and 1088]

c. linear():
[Figure: top-down breakdown for Linear at in_size 6, 24, and 1008]

d. matmul():
[Figure: top-down breakdown for Matmul at in_size 4, 36, and 1024]

e. softmax():
[Figure: top-down breakdown for Softmax at in_size 3, 10, and 1280]

In general, as input sizes grew larger, the retiring rate increased while the frontend-bound and bad-speculation rates decreased. This is most pronounced in the matmul() and convolution() functions. Since the retiring rate measures the fraction of pipeline slots spent usefully completing instructions, and the frontend-bound rate measures how much time the CPU spends waiting for instructions to be fetched and decoded, the code ran more efficiently as the input size increased. Bad speculation also decreased, which means fewer unnecessary operations were performed when larger inputs were provided. Since the backend-bound fraction does not change noticeably, the main bottleneck remains in the backend, likely due to long-latency operations or an inherently slow memory system.

2. Function Implementation:

a. convolution(): The convolution function iterates over all filters (numFilters) and all channels (numChannels). For each output pixel (i, j), it computes the weighted sum of the input pixels in the kernel window, accumulates the per-channel sums for each filter, adds the bias term, and applies the ReLU activation function.

b. relu(): If value is greater than 0, return value; otherwise, return 0.
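The convolution loop structure described in 2a could be sketched in C roughly as follows. This is a minimal illustration, not the lab's actual code: the argument names, flattened array layout, stride of 1, and "valid" (no-padding) output size are all assumptions made for the sketch.

```c
/* Hypothetical sketch of the convolution described in 2a: for each
   filter, sum a weighted kernel window over every channel at each
   output pixel (i, j), add the filter's bias, and apply ReLU.
   All names and the flat row-major layout are illustrative. */
static float relu(float x) { return x > 0.0f ? x : 0.0f; }

void convolution(const float *in,   /* [numChannels][inSize][inSize]   */
                 const float *w,    /* [numFilters][numChannels][k][k] */
                 const float *bias, /* [numFilters]                    */
                 float *out,        /* [numFilters][outSize][outSize]  */
                 int numFilters, int numChannels, int inSize, int k)
{
    int outSize = inSize - k + 1;   /* assumes stride 1, no padding */
    for (int f = 0; f < numFilters; f++)
        for (int i = 0; i < outSize; i++)
            for (int j = 0; j < outSize; j++) {
                float sum = 0.0f;
                /* accumulate the kernel window across all channels */
                for (int c = 0; c < numChannels; c++)
                    for (int ki = 0; ki < k; ki++)
                        for (int kj = 0; kj < k; kj++)
                            sum += in[(c * inSize + i + ki) * inSize + (j + kj)]
                                 * w[((f * numChannels + c) * k + ki) * k + kj];
                out[(f * outSize + i) * outSize + j] = relu(sum + bias[f]);
            }
}
```

The six nested loops make clear why the profiler attributes most of this kernel's time to the backend: the inner products stream through memory and arithmetic units rather than stalling the frontend.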
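The ReLU rule in 2b maps directly to a one-line C function; the float type and function name here are assumptions for illustration:

```c
/* Sketch of the ReLU in 2b: pass positive values through, clamp
   everything else to zero. */
float relu(float value)
{
    return value > 0.0f ? value : 0.0f;
}
```

The single data-dependent branch (or conditional move, if the compiler emits one) is consistent with ReLU's profile: almost no frontend pressure, with bad speculation shrinking as predictable large inputs dominate.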