Carbon-aware Federated Learning with Model Size Adaptation

dc.contributor.advisor: Drew, Steve
dc.contributor.advisor: Wang, Xin
dc.contributor.author: Abbasi, Ali
dc.contributor.committeemember: Far, Behrouz
dc.contributor.committeemember: Moshirpour, Mohammad
dc.date: 2024-11
dc.date.accessioned: 2024-08-01T16:10:01Z
dc.date.available: 2024-08-01T16:10:01Z
dc.date.issued: 2024-07-23
dc.description.abstract: Developing machine learning models heavily depends on the availability of data. Establishing a responsible data economy and safeguarding data ownership are essential to facilitate learning from distinct, heterogeneous data sources without centralizing data. Federated learning (FL) provides a collaborative framework for training models on data from geographically distributed clients, such as smartphones and IoT devices. Each client has a distinct carbon footprint determined by its local energy sources, so learning from this decentralized data can produce significant carbon emissions. This variability in carbon intensity poses a substantial challenge for environmentally sustainable model training with minimal carbon emissions. This thesis introduces carbon-aware strategies within FL that mitigate total carbon emissions through strategic client engagement and resource allocation. We propose two distinct methods: (1) FedGreenCS, which clusters clients by data distribution using a client similarity matrix and offsets high-emission clients with lower-emission ones, and (2) FedGreen, which adapts model sizes to the carbon intensity of client locations using model compression techniques. We conduct a theoretical analysis of the trade-off between carbon emissions and convergence accuracy, accounting for carbon intensity disparities across regions to select parameters optimally. Empirical studies show that model size adaptation significantly reduces the carbon footprint of FL, surpassing contemporary methods while maintaining competitive accuracy. These results affirm the effectiveness of both approaches in harmonizing model performance with environmental impact, and highlight client selection and model adaptation as sustainable strategies for distributed learning.
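To make the model size adaptation idea concrete, the following is a minimal, illustrative Python sketch rather than the thesis implementation: the thresholds, the linear mapping, and the function name width_for_client are assumptions. It shows one way a client's grid carbon intensity could be mapped to a sub-model width fraction in an ordered-dropout-style scheme, so that clients on cleaner grids train wider sub-models and clients on carbon-intensive grids train smaller ones.

    # Illustrative sketch only (not the thesis code): map grid carbon
    # intensity to a sub-model width fraction for carbon-aware FL.
    # Thresholds and the linear interpolation are assumed for illustration.

    def width_for_client(carbon_intensity_gco2_per_kwh: float,
                         min_width: float = 0.25,
                         max_width: float = 1.0,
                         low: float = 100.0,
                         high: float = 600.0) -> float:
        """Return a sub-model width fraction for a client, given its grid
        carbon intensity in gCO2/kWh: cleaner grids get wider sub-models."""
        if carbon_intensity_gco2_per_kwh <= low:
            return max_width
        if carbon_intensity_gco2_per_kwh >= high:
            return min_width
        # Linearly interpolate between the clean and carbon-intensive thresholds.
        frac = (carbon_intensity_gco2_per_kwh - low) / (high - low)
        return max_width - frac * (max_width - min_width)

    if __name__ == "__main__":
        # Example: three clients on grids with different carbon intensities.
        for ci in (50.0, 300.0, 700.0):
            print(f"{ci:5.0f} gCO2/kWh -> width fraction {width_for_client(ci):.2f}")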
dc.identifier.citation: Abbasi, A. (2024). Carbon-aware federated learning with model size adaptation (Master's thesis, University of Calgary, Calgary, Canada). Retrieved from https://prism.ucalgary.ca.
dc.identifier.uri: https://hdl.handle.net/1880/119307
dc.language.iso: en
dc.publisher.faculty: Graduate Studies
dc.publisher.institution: University of Calgary
dc.rights: University of Calgary graduate students retain copyright ownership and moral rights for their thesis. You may use this material in any way that is permitted by the Copyright Act or through licensing that has been assigned to the document. For uses that are not allowable under copyright legislation or licensing, you are required to seek permission.
dc.subject: federated learning
dc.subject: carbon emission
dc.subject: model compression
dc.subject: ordered dropout
dc.subject.classification: Artificial Intelligence
dc.subject.classification: Engineering--Electronics and Electrical
dc.subject.classification: Computer Science
dc.title: Carbon-aware Federated Learning with Model Size Adaptation
dc.type: master thesis
thesis.degree.discipline: Engineering – Electrical & Computer
thesis.degree.grantor: University of Calgary
thesis.degree.name: Master of Science (MSc)
ucalgary.thesis.accesssetbystudent: I do not require a thesis withhold – my thesis will have open access and can be viewed and downloaded publicly as soon as possible.
Files

Original bundle
Name: ucalgary_2024_abbasi_ali.pdf
Size: 1.44 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 2.62 KB
Format: Item-specific license agreed upon to submission