Browsing by Author "Wang, Xin"
Now showing 1 - 20 of 64
Item Open Access
3-D Cadastral Boundary Relationship Classification Algorithms using Conformal Geometric Algebra (2021-04-26)
Pullano, Dillon; Barry, Michael; Wang, Xin; O'Keefe, Kyle; Detchev, Ivan; Barry, Michael; Wang, Xin; Rangelova, Elena
As urban centers continue to grow and develop, there is an increasing need for institutions to be able to digitally model and perform relationship analysis on 3-D cadastral boundary data. 3-D boundary analysis can be performed through visual inspection of survey plan drawings, but this often requires professional expertise, such as that of a land surveyor or lawyer. This study examined the development, testing, and application of methodological processes and algorithms designed to classify various geometrical and topological relationships between the boundary components of two 3-D cadastral units in order to solve cadastral boundary problems. It applied established mathematical theory, using Conformal Geometric Algebra objects and operational techniques in combination with various 3-D point-point distance evaluations and geometric concepts, to the classification of relationships between 3-D cadastral boundaries. A literature search suggests that the theory and methodology as applied in this study have not been used elsewhere to classify topological relationships between 3-D cadastral boundaries. Six sets of data flow processing algorithms were developed to determine the relationship classifications between boundary component pair sets that exist between two 3-D cadastral units. The classification processes were first validated using seven simulated experimental testing datasets, each consisting of two cube-like units. The classification processes were then applied to a cadastral dataset derived from a condominium survey plan registered in Alberta, Canada. This showed how the methods developed here can be applied to a practical 3-D cadastral boundary problem in the land surveying field, specifically validating a shared boundary between two adjacent condominium units, as intended on the plan, before survey plan registration. Results from the experimental datasets support the methods proposed to classify 53 distinct types of topological relationships between 3-D boundary component pair sets. While this type of boundary relationship analysis can be done through visual inspection of survey plans, the methods developed here are more mathematically rigorous. These processes could be leveraged by land surveyors and land administration professionals when analyzing 3-D survey plan boundaries.

Item Open Access
A Geospatial Infrastructure to Collect, Evaluate, and Distribute Volunteered Geographic Information for Disaster Management (2016)
Poorazizi, Mohammad Ebrahim; Lichti, Derek; Liang, Steve; Wang, Xin; Jacobson, Daniel; Kalantari, Mohsen
Recent disasters, such as the 2010 Haiti earthquake, have drawn attention to the potential role of citizens as active information producers. By using location-aware devices such as smartphones to collect geographic information in the form of geo-tagged text, photos, or videos, and sharing this information through online social media, such as Twitter, citizens create Volunteered Geographic Information (VGI). This thesis presents a framework for the effective use of VGI in disaster management platforms.
The proposed framework consists of four components: (i) a VGI brokering module, to provide a standard service interface to retrieve VGI from multiple social media streams, (ii) a VGI quality control component, to evaluate the spatiotemporal relevance and credibility of VGI, (iii) a VGI publisher module, which uses a service-based delivery mechanism to disseminate VGI, and (iv) a VGI discovery component, which acts like a yellow-pages service to find, browse, and query available VGI datasets. A set of quality metrics specifically designed for VGI evaluation is introduced. This research also presents a prototype implementation, including an evaluation with social media data collected during Typhoon Hagupit (i.e., Typhoon Ruby), which hit the Philippines in December 2014. The evaluation results suggest that the proposed framework provides a promising solution toward the effective use of VGI in disaster management platforms. Utilization of the proposed quality metrics on the collected VGI database – with multiple social media stream contributions – will allow disaster response teams to make informed decisions that could save lives, meet basic humanitarian needs earlier, and perhaps limit environmental and economic damage.
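The abstract above does not spell out the quality metrics themselves, so the following is only a minimal sketch of what a spatiotemporal relevance score for a single VGI report might look like: distance decay from the disaster location combined with exponential time decay. The function names, decay constants, and weighting are assumptions for illustration, not the thesis's actual metrics.

```python
import math
from datetime import datetime, timezone

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def spatiotemporal_relevance(report_lat, report_lon, report_time,
                             event_lat, event_lon, event_time,
                             d_scale_km=50.0, t_scale_h=24.0, w_space=0.5):
    """Score in [0, 1]: 1 for a report at the event location and time,
    decaying exponentially with distance and with time offset from the event."""
    d = haversine_km(report_lat, report_lon, event_lat, event_lon)
    dt_h = abs((report_time - event_time).total_seconds()) / 3600.0
    s_space = math.exp(-d / d_scale_km)
    s_time = math.exp(-dt_h / t_scale_h)
    return w_space * s_space + (1.0 - w_space) * s_time

# Example: a geo-tagged post roughly 30 km from landfall, six hours later.
event = (11.05, 125.0, datetime(2014, 12, 6, 12, 0, tzinfo=timezone.utc))
report = (10.90, 125.2, datetime(2014, 12, 6, 18, 0, tzinfo=timezone.utc))
print(round(spatiotemporal_relevance(*report, *event), 3))
```

In a real brokering pipeline this score would be one input among several (source credibility, content checks) rather than a stand-alone filter.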
Item Open Access
A Study on Efficient Vector Mapping With Vector Tiles Based on Cloud Server Architecture (2015-12-07)
Shang, Xiaohong; Liang, Steve; Wang, Xin; Wang, Ruisheng; Kattan, Linda
In web mapping, transmitting large vector data over the Internet has been a challenging issue over the past decade. A method for delivering large vector data in small pieces is known as vector tiling. Generally, studies of traditional vector-tile-based methods in web-mapping applications were limited to a simple single server-client architecture with GeoJSON-encoded vector tiles. However, problems such as limited scalability and inefficient vector tile transmission arose in these studies. To solve these problems, a distributed memory caching implementation has been proposed using cloud architecture. This study also explored the transmission efficiency of three vector tile encoding formats: GeoJSON, TopoJSON, and Google Protocol Buffers. A prototype of the Canada road network vector map was developed. The results of this study show that the proposed solution improves application performance and is scalable in comparison to the naïve architecture.

Item Open Access
A traffic accident risk mapping framework (2012-06-20)
Wang, Jing; Wang, Xin
Identifying traffic accident concentration areas is important for road safety improvements. Previous spatial concentration detection methods did not consider the severity levels of accidents, and a single traffic accident risk map for the whole study area ignores different users' requirements. This thesis proposes an ontology-based traffic accident risk mapping framework. In the framework, the ontology represents the domain knowledge related to traffic accidents and supports data retrieval based on users' requirements. A new spatial clustering method, called DBCTAR (Density-based Clustering for Traffic Accident Risk), which takes into account the numbers and severity levels of accidents, is proposed for risk mapping. To demonstrate the framework and the new algorithm, the Ontology-based Traffic Accident Risk Mapping (ONTO_TARM) system and a web-based clustering service, GeoClustering, have been developed. Four case studies in the city of Calgary with final risk maps are presented and discussed.
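The DBCTAR algorithm itself is not given in the abstract above. The sketch below only illustrates the general idea of a density-based clustering (DBSCAN-style) in which a neighbourhood's "density" is the sum of accident severity weights rather than a plain point count; the eps/min_weight parameters and severity weights are hypothetical, and this is not the thesis's actual implementation.

```python
import math

def weighted_dbscan(points, weights, eps, min_weight):
    """DBSCAN-style clustering where a point is a core point if the total
    severity weight within eps (itself included) reaches min_weight.
    points: list of (x, y); weights: severity weight per accident.
    Returns a list of cluster labels (-1 = noise)."""
    n = len(points)
    labels = [None] * n

    def neighbours(i):
        xi, yi = points[i]
        return [j for j, (xj, yj) in enumerate(points)
                if math.hypot(xi - xj, yi - yj) <= eps]

    cluster = -1
    for i in range(n):
        if labels[i] is not None:
            continue
        nb = neighbours(i)
        if sum(weights[j] for j in nb) < min_weight:
            labels[i] = -1            # provisionally noise
            continue
        cluster += 1
        labels[i] = cluster
        seeds = [j for j in nb if j != i]
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:       # border point previously marked as noise
                labels[j] = cluster
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nb_j = neighbours(j)
            if sum(weights[k] for k in nb_j) >= min_weight:
                seeds.extend(k for k in nb_j if labels[k] is None or labels[k] == -1)
    return labels

# Toy example: coordinates in km; severity 1 = property damage, 3 = injury, 5 = fatality.
pts = [(0.0, 0.0), (0.1, 0.1), (0.2, 0.0), (5.0, 5.0), (5.1, 5.1), (9.0, 0.0)]
sev = [1, 5, 3, 1, 1, 1]
print(weighted_dbscan(pts, sev, eps=0.5, min_weight=4))   # -> [0, 0, 0, -1, -1, -1]
```

With this weighting, a small group of severe collisions can form a risk cluster even where the raw accident count is low, which is the intuition behind severity-aware risk mapping.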
Item Open Access
A traffic accident risk mapping framework (2012)
Wang, Jing; Wang, Xin

Item Open Access
Activity-based and Behavior-based Location Recommendation in Location Based Social Networks (2014-01-31)
Rahimi, Seyyed Mohammadreza; Wang, Xin
Location-Based Social Networks (LBSNs) are social networks with functionalities that let users share their location information with other users. Location recommendation is the task of suggesting unvisited locations to the users. A good location recommender should make user-specific recommendations based on users' preferences, geographical constraints and time. In this thesis, we investigate the development of two novel location recommendation methods for LBSNs: the Probabilistic Category-based Location Recommender (PCLR) and the Behavior-based Location Recommender (BLR). The PCLR method finds the temporal and spatial patterns of users' activities in the form of temporal and spatial probability distributions. It then uses the patterns to select the right category of locations and recommend nearby locations of that type to the user. The BLR method, on the other hand, first extracts user behaviors from their check-in history. It then utilizes a collaborative filtering technique to extract common behaviors and predict the behavior of the user at a given time. Finally, BLR filters locations in the user's proximity based on the predicted behavior when making the location recommendation. The PCLR and BLR methods are evaluated in a set of experiments on a real-world check-in dataset. These experiments show that the PCLR and BLR methods improve the performance of existing location recommenders in terms of precision and recall. Additionally, the BLR method produces much better recommendations for cold-start users.

Item Open Access
An Adaptive Land Tenure Information System Database Design for Conflict and Post-Conflict Situations (2017)
Dabboor, Alaa; Barry, Michael; Wang, Xin; Detchev, Ivan
The key objective of this study was to design, develop, and test a schema-less graph network Land Tenure Information System (LTIS) database prototype, integrated with data mining and social network analysis techniques, for the purpose of revealing hidden tenure information in data related to conflict and post-conflict situations. Conventional LTISs are ineffective in conflict and post-conflict situations because they only describe recorded tenure information, and therefore these systems do not reflect land practices taking place on the ground. In conflict and post-conflict situations, multiple sets of state-held and privately held land tenure records may exist. The question, then, is how LTISs can be better designed to capture and describe tenure information in conflict and post-conflict situations. The study adapts a spiral software development model to develop a Talking Titler Network (TTN) database prototype. Simulated tenure data from two illustrative cases was entered and automatically mined and analysed within the database system. The results show that a schema-less graph network database integrated with data mining and social network analysis techniques can capture and describe complex land tenure information among and between people and tenure objects. In addition, the integrated techniques automatically extract, investigate, and visualise embedded tenure information emerging from these situations.
The experimental test results provide important empirical observations to advance the TTN database design and development in order to support land tenure dispute resolution in conflict and post-conflict situations. However, further field work needs to be carried out to validate the results.
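As a rough illustration of the schema-less graph idea described above, the sketch below stores people, parcels, and evidence documents as nodes and tenure relationships as typed edges, then runs a basic social network analysis measure over them. The node names, relationship labels, and the use of NetworkX are illustrative assumptions, not the TTN prototype's actual design.

```python
import networkx as nx

# A schema-less multigraph: any node can be linked to any other by any relationship type.
G = nx.MultiDiGraph()

# Nodes carry only a free-form "kind" attribute instead of a fixed schema.
G.add_node("Amina", kind="person")
G.add_node("Family B", kind="group")
G.add_node("Parcel 12", kind="land_object")
G.add_node("Deed 1998", kind="document")
G.add_node("Witness statement", kind="document")

# Typed edges record competing or overlapping tenure claims.
G.add_edge("Amina", "Parcel 12", relation="occupies", since=2015)
G.add_edge("Family B", "Parcel 12", relation="claims_ownership", since=1998)
G.add_edge("Deed 1998", "Family B", relation="supports_claim_of")
G.add_edge("Witness statement", "Amina", relation="supports_claim_of")

# Simple queries: which interests are recorded against a parcel, and which
# nodes are most connected (a crude social-network-analysis measure).
claims = [(u, d["relation"]) for u, v, d in G.in_edges("Parcel 12", data=True)]
print("Interests recorded against Parcel 12:", claims)
print("Degree centrality:", nx.degree_centrality(G))
```

Because edges are just labelled links, contradictory records (occupation versus a registered deed) can coexist in the database and be surfaced by queries rather than rejected by a rigid schema.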
Item Open Access
Assessing Lightning and Wild Fire Hazard by Land Properties and Cloud to Ground Lightning Data with Association Rule Mining over Alberta, Canada (2017)
Cha, Donghwan; Kim, Jeong Woo; Wang, Xin; Liang, Steve
Characteristics of Cloud to Ground (CG) lightning over Alberta, Canada were investigated using 2010-2016 lightning data and data mining methods. A hotspot analysis was implemented to find the regions where high-frequency CG lightning strikes are clustered together. Generally, hotspot regions are located in the central, central-east and south-central parts of the study area. About 94% of annual lightning occurred in the warm months (June to August), and the daily lightning frequency was influenced by the diurnal heating cycle. The CG lightning frequency associated with land properties was investigated by measuring a preference index (PI). The association rule mining technique was used to investigate frequent CG lightning patterns, which were verified by similarity measurement to check the patterns' consistency. The CG lightning hazard map generated with 2010-2014 data was verified by comparing it to unprocessed raw CG lightning data from 2015-2016. The similarity coefficient values indicated high correlations throughout the entire study period, and actual CG lightning generally occurred more frequently in the higher-risk regions of the hazard map. Most wild fires (around 93%) in Alberta occurred in forests, wetland forests and wetland shrub areas. It was also found that lightning and wild fire coincide in two distinct kinds of areas: regions with frequent wild fires and a high frequency of lightning, and regions with frequent wild fires but a low frequency of lightning. Further, the preference index (PI) revealed land classes where wild fires occurred more frequently than in other class regions. As one of the potential applications of this research, the wild fire hazard area was estimated with the CG lightning hazard map and specific land use types.

Item Open Access
Bayesian Calibration for Logit Model Microsimulations: Case for PECAS SD in San Diego (2017)
Hill, Graham; Hunt, John Douglas; Kattan, Lina; Dann, Markus; Wang, Xin
Bayesian inference is a versatile method for incorporating new information into a model while still respecting existing knowledge. One application of Bayesian inference is the calibration of models that are controlled by a large number of parameters, but where the data usable for calibration is incomplete or unreliable. Microsimulation models of urban development fit both of these criteria, but calibrating them is further complicated by their non-determinism. I investigated a calibration method called Bayesian Expected Value Calibration (BEVC), which is designed to overcome non-determinism while incorporating existing knowledge, in the context of the PECAS Space Development model of the San Diego area. The test consisted of creating synthetic data using known behavioural parameters, calibrating the Space Development model to targets derived from the synthetic data and with priors reflecting imperfect existing knowledge, and assessing how closely the calibrated parameters matched the true values.
I found that BEVC was generally effective at converging towards the true values of the parameters, and often received meaningful contributions from both the prior knowledge and the new observations under a range of plausible conditions. As would be expected from Bayesian theory, increasing the number of observations or the amount of useful prior knowledge improved the accuracy of the calibration. The method was robust under reasonable levels of human fallibility in creating the priors, and only suffered a significant loss of accuracy under extreme assumptions. However, more sophisticated methods of objectively determining the weights to assign to the data sources did not significantly improve calibration accuracy.
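BEVC itself is more involved than the abstract above can convey. As a minimal, hypothetical illustration of the underlying idea, the sketch below combines a Gaussian prior on one behavioural parameter with the mean of several noisy simulated observations, showing how more observations pull the posterior toward the data. The numbers and the conjugate-Gaussian simplification are assumptions, not the PECAS SD calibration.

```python
import numpy as np

rng = np.random.default_rng(42)

true_value = 1.8                    # "unknown" parameter used to generate synthetic data
prior_mean, prior_sd = 1.0, 0.5     # imperfect existing knowledge
obs_sd = 0.6                        # noise from the non-deterministic microsimulation

def gaussian_posterior(prior_mean, prior_sd, observations, obs_sd):
    """Conjugate normal-normal update for the mean of noisy observations."""
    n = len(observations)
    prior_prec = 1.0 / prior_sd ** 2
    data_prec = n / obs_sd ** 2
    post_var = 1.0 / (prior_prec + data_prec)
    post_mean = post_var * (prior_prec * prior_mean + data_prec * np.mean(observations))
    return post_mean, np.sqrt(post_var)

for n in (2, 10, 50):
    obs = rng.normal(true_value, obs_sd, size=n)
    mean, sd = gaussian_posterior(prior_mean, prior_sd, obs, obs_sd)
    print(f"n={n:3d}  posterior mean={mean:5.2f}  sd={sd:4.2f}")
```

With few observations the posterior stays close to the prior; as observations accumulate it converges on the true value, which mirrors the qualitative findings reported above.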
Item Open Access
Behavior-based and Contextual Location Recommendation for Location Based Social Networks (2021-02-10)
Rahimi, Seyyed Mohammadreza; Wang, Xin; Far, Behrouz Homayoun; El-Sheimy, Naser; Liang, Steve H. L.; Uddin, Gias; Wachowicz, Mónica
Location-Based Social Networks (LBSNs) are social networks with functionalities that let users share their location information with other users. Location recommendation, the task of suggesting unvisited locations to users, is a service capable of improving user engagement and bringing more check-ins to an LBSN. An effective location recommendation model makes user-specific recommendations based on users' preferences, geographical constraints and contextual information such as time and weather. The main question in designing such a model is how to effectively utilize these different types of information. In this thesis, we investigate the development of two novel location recommendation methods for LBSNs that utilize temporal and other contextual information to produce improved recommendations: the Behavior-based Location Recommendation (BLR) and the Contextual Location Recommender (CLR). The BLR finds the temporal and spatial patterns of users' behaviors in the form of temporal and spatial probability distributions. It then uses the patterns to predict the location type and recommends nearby locations of that type to the user. The CLR method, on the other hand, first extracts the responses of the users to contextual triggers from their check-in history. It then utilizes a tensor factorization technique to extract common responses and predict the user response for a given set of contextual triggers. Finally, CLR filters locations in the user's proximity based on the predicted location type. To find user similarities, both BLR and CLR utilize Random Walk with Restart. To improve the performance of these methods, an optimized random walk with restart method is also proposed that can improve the time complexity of random walk with restart by a factor of at least 6.75. Both BLR and CLR are evaluated in a set of experiments on a real-world check-in dataset. These experiments show that the BLR and CLR methods improve the performance of existing location recommendation methods in terms of precision and recall. Additionally, both BLR and CLR achieve higher precision and recall values for cold-start users compared to the well-known baseline models.
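Random Walk with Restart is a standard graph-proximity measure; the short sketch below shows the plain power-iteration form on a toy user-similarity graph. It is not the optimized variant proposed in the thesis (whose speedup the abstract reports), just the baseline computation for reference.

```python
import numpy as np

def rwr(adj, seed, restart=0.15, tol=1e-8, max_iter=1000):
    """Random walk with restart: steady-state visiting probabilities of a
    walker that returns to `seed` with probability `restart` at each step."""
    n = adj.shape[0]
    # Column-normalize so each column is a probability distribution over neighbours.
    col_sums = adj.sum(axis=0)
    col_sums[col_sums == 0] = 1.0
    W = adj / col_sums
    e = np.zeros(n)
    e[seed] = 1.0
    p = e.copy()
    for _ in range(max_iter):
        p_next = (1 - restart) * W @ p + restart * e
        if np.abs(p_next - p).sum() < tol:
            return p_next
        p = p_next
    return p

# Toy check-in similarity graph: users 0-1-2 form a clique, user 3 hangs off user 2.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(np.round(rwr(A, seed=0), 3))   # proximity of every user to user 0
```

The resulting vector ranks users by graph proximity to the seed user, which is the similarity signal the recommenders above feed into their filtering steps.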
Item Open Access
A Biologically Inspired Supervised Learning Rule for Audio Classification with Spiking Neural Networks (2021-06-15)
Peterson, Dylan George; Leung, Henry; Westwick, David; Uddin, Gias; Wang, Xin
Audio classification has many practical applications, such as noise pollution monitoring, wildlife monitoring, audio surveillance, speech recognition, and more. For many of these applications, deploying classifiers on low-powered devices for persistent monitoring is desirable. Artificial neural networks (ANNs) have achieved significant success in audio classification tasks. However, it may not always be feasible to deploy current state-of-the-art ANNs to embedded devices due to their memory footprint and power consumption. Biologically inspired neural networks, also known as spiking neural networks (SNNs), have been shown to significantly reduce power consumption during inference when compared with equivalent ANNs. They have also been theoretically proven to be more computationally powerful per unit than ANNs. These two properties make SNNs an attractive solution for machine learning tasks on low-powered embedded devices, such as at the edge in an Internet of Things (IoT) sensor network. However, SNNs tend to lag behind ANNs in performance. This is partially because training SNNs is difficult: the standard backpropagation algorithm is not directly applicable due to the non-differentiable spiking nature of SNNs. Encoding data into spike trains compatible with SNNs is also an unresolved question when applying SNNs. This work compares different spike encoding schemes for audio data, and a learning algorithm for multilayer SNNs inspired by biologically plausible learning rules is developed. The proposed learning rule is then successfully applied to simple pattern recognition and audio classification tasks.

Item Open Access
Brittleness and Fracability Evaluation of Unconventional Reservoirs (2018-05-10)
Hu, Yuan; Chen, Zhangxing (John); Huang, Haiping; Wang, Xin; Hejazi, Seyed Hossein; Nouri, Alireza M.
Brittleness and fracability evaluation plays an important role in the recovery of unconventional oil and gas; it directly influences the effectiveness of hydraulic fracturing. The definition of brittleness is controversial, and the existing analytical/semi-analytical models have no unified theory to support them, so brittleness and fracability evaluation is currently unreliable. Unconventional reservoirs have different confining pressures, pore pressures and temperatures. Models that do not consider these influences lack accuracy in the brittleness index (BI) calculation, resulting in failure during hydraulic fracturing. This research is focused on establishing new methods for brittleness and fracability evaluation. First, analytical/semi-analytical models are proposed considering the influence of confining pressure, pore pressure and temperature, respectively. The influence of calcite on rock mechanics parameters and brittleness is compared to that of quartz and clay. The weight of each parameter in models based on elastic modulus and mineralogy is analyzed. Finally, a numerical method to evaluate rock brittleness in terms of energy is developed. This novel method is applied to evaluate rock brittleness and fracability in more complicated conditions by considering hydro-mechanical (HM) interaction.
By defining brittleness in terms of energy, rock brittleness from different sources can be compared, and the factors ignored by other brittleness evaluation models (pressure, temperature and rock texture) can be addressed at the same time. By combining the analytical and numerical methods for brittleness and fracability evaluation, the resulting evaluations are more applicable because they reflect a more realistic unconventional oil and gas reservoir environment.
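For context on what an elastic-modulus-based brittleness index looks like, the sketch below implements the widely cited normalization of Young's modulus and Poisson's ratio (after Rickman et al.). This is one common baseline definition, not the pressure-, temperature-, or energy-based models developed in the thesis, and the normalization bounds are illustrative.

```python
def brittleness_index(E, nu, E_min=1.0, E_max=8.0, nu_min=0.15, nu_max=0.40):
    """Elastic-modulus brittleness index in percent (Rickman-style):
    high Young's modulus (E, in 10^6 psi) and low Poisson's ratio (nu) -> more brittle.
    The normalization bounds are illustrative and should come from the field dataset."""
    e_norm = (E - E_min) / (E_max - E_min) * 100.0
    nu_norm = (nu - nu_max) / (nu_min - nu_max) * 100.0
    return 0.5 * (e_norm + nu_norm)

# Quartz-rich vs clay-rich samples (illustrative lab values).
print(round(brittleness_index(E=6.5, nu=0.20), 1))  # stiffer rock, low nu -> high BI
print(round(brittleness_index(E=2.0, nu=0.35), 1))  # softer rock, high nu -> low BI
```

Because this index ignores confining pressure, pore pressure and temperature, it illustrates exactly the limitation the thesis sets out to address with its energy-based numerical method.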
Item Open Access
Calibration, Validation, and Verification of Static Terrestrial Laser Scanning for Professional Land Surveying of 3D Boundaries (2017)
Rondeel, Samuel; Barry, Michael; Lichti, Derek; Wang, Xin; Collins, Michael
This thesis examines the validity of static terrestrial laser scanning self-calibration and measurement procedures within current 3D cadastral surveying law in Canada, Australia and South Africa. It examines methodologies used to validate static terrestrial laser scanning outputs subjected to rigorous cross-examination within professional land surveying missions. Due to the construction and design of current laser scanning systems, the raw measurements are not typically available for analysis by the operator, and thus their validity could be scrutinized in a court of law. The objectives are met by reviewing and analyzing typical terrestrial laser scanner measurements and outputs based on the laser scanning system construction, scanning environment, and scanning mission procedures. The results show that while terrestrial laser scanning systems provide invaluable information, they could be scrutinized if the proper procedures are not followed. However, the results also suggest that the complementary methods of terrestrial laser scanning and total station measurements provide the most rigorous results when defining 3D boundaries.

Item Open Access
Carbon-aware Federated Learning with Model Size Adaptation (2024-07-23)
Abbasi, Ali; Drew, Steve; Wang, Xin; Far, Behrouz; Moshirpour, Mohammad
Developing machine learning models heavily depends on the availability of data. Establishing a responsible data economy and safeguarding data ownership are essential to facilitate learning from distinct, heterogeneous data sources without centralizing data. Federated learning (FL) provides a collaborative framework that enables model development using data from geographically distributed clients, each characterized by a unique carbon footprint associated with varying energy sources; learning from decentralized data on edge clients such as smartphones and IoT devices can therefore lead to significant carbon emissions. This variability in carbon intensity poses a substantial challenge to environmentally sustainable model training with minimal carbon emissions. This thesis introduces innovative carbon-aware strategies within FL to mitigate total carbon emissions through strategic client engagement and resource allocation. We propose two distinct methods: (1) clustering clients based on data distribution and offsetting high carbon emissions with clients exhibiting lower emissions, implemented through a client similarity matrix (FedGreenCS), and (2) adapting model sizes based on the carbon intensity of client locations (FedGreen), employing model compression techniques. Our results affirm the effectiveness of both approaches in harmonizing model performance with environmental impact, underscoring their potential as sustainable solutions in distributed learning scenarios.
We conduct a theoretical analysis of the trade-offs between carbon emissions and convergence accuracy, taking into account the carbon intensity disparities across different regions to optimally select parameters. Empirical studies reveal that model size adaptation significantly reduces the carbon footprint of FL, surpassing contemporary methods while maintaining competitive accuracy. This research also highlights the viability of client selection and model adaptation as sustainable strategies in distributed learning contexts.
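The abstract above describes FedGreen only at a high level. The sketch below shows one naive way a server might map each client's grid carbon intensity to a compressed model size before a training round; the tier thresholds, intensities, and size fractions are made-up illustrations, not the thesis's actual policy or results.

```python
# Hypothetical carbon intensities (gCO2-eq per kWh) reported per client region.
client_carbon = {"client_A": 45, "client_B": 210, "client_C": 520, "client_D": 710}

# Illustrative policy: clients on cleaner grids train larger models, while clients
# on dirtier grids receive heavily compressed models so a round emits less carbon.
TIERS = [(100, 1.00), (400, 0.50), (float("inf"), 0.25)]  # (max intensity, fraction of full model)

def assign_model_fraction(intensity_g_per_kwh):
    for ceiling, fraction in TIERS:
        if intensity_g_per_kwh <= ceiling:
            return fraction
    return TIERS[-1][1]

def plan_round(clients, full_model_mflops=900.0):
    """Return per-client (model fraction, approximate compute per local epoch)."""
    return {c: (assign_model_fraction(ci), assign_model_fraction(ci) * full_model_mflops)
            for c, ci in clients.items()}

for client, (frac, mflops) in plan_round(client_carbon).items():
    print(f"{client}: {frac:.0%} of full model, ~{mflops:.0f} MFLOPs per local epoch")
```

In a full system the chosen fractions would feed a model compression step and the aggregation rule, and the thresholds would be tuned against the accuracy-emissions trade-off the thesis analyzes.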
Item Open Access
Characterization and Modelling of Stormwater for the City of Calgary (2017)
Shrestha, Dhiraj; He, Jianxun (Jennifer); Chu, Angus; Wang, Xin
The quantification of pollutant loading from nonpoint pollution sources is very challenging but crucial. Statistical analyses were performed to identify differences in stormwater quality among different types of land use and among catchments of the same land use in three types of flow (baseflow, snowmelt and stormwater runoff). Results indicate that water quality parameters vary among different types of land use and among catchments of the same land use. In addition, the Stormwater Management Model (SWMM) was calibrated and verified for industrial and residential land uses. The modeling results clearly demonstrate distinct coefficient values for pollutant build-up and wash-off. Rainfall, as the source of stormwater, was also investigated to characterize its spatial and temporal distribution across the city. The identified differences in stormwater quality from the statistical analysis and modeling suggest the need to quantify and model pollutant loading from different types of land use.

Item Open Access
Development of Adaptable Products Based on Modular Design and Optimization Methods (2016)
Martinez Barragan, Maribel; Xue, Deyi; Ramirez-Serrano, Alejandro; Li, Simon; Wang, Xin
An adaptable product is a product that can be reconfigured or upgraded to satisfy different requirements. Among various advanced design methods, the modular design approach is employed in this research for the design of adaptable products. A module in a product is a group of components that can be disassembled non-destructively from the product as a unit. In the traditional modular design approach, components of a product are grouped into modules based on similarity among their functions and/or manufacturing processes. This traditional approach does not consider that the product may have to be modified or upgraded due to a change in requirements during the operational stage of the product. This creates a problem, since the part that needs to be modified to satisfy the new requirement cannot be replaced on its own; the entire module to which it belongs has to be replaced. The objective of this research is to improve the adaptable design method by developing a module design approach that considers the different life-cycle properties of the components in the adaptable product. In addition, optimization is used to identify the optimal design of the adaptable product. In this research, the product description in different life-cycle phases is modeled by different configurations, and each of these configurations is described by a set of parameters. Product components with similar life-cycle properties, such as maintenance frequency, life-span and degradation of performance, are grouped into modules.
A hybrid AND-OR tree is used to model all feasible design solutions, considering different configurations with their corresponding parameters at different life-cycle phases. The adaptable product at a certain life-cycle time point is evaluated by a number of evaluation measures, which can have different measurement units. The evaluation measures in different units are converted into comparable evaluation indices. The overall evaluation index for an adaptable product is defined by the individual evaluation indices and their importance weighting factors considering the whole product life-cycle span. A multi-level optimization method is employed to identify the best design solution, its configurations in different life-cycle phases, and the parameter values of the relevant configurations. A case study is implemented to demonstrate the effectiveness of the developed new adaptable design approach.
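As a small sketch of the evaluation step described above, the code below normalizes evaluation measures with different units into [0, 1] indices and combines them with importance weights. The measures, directions, and weights are invented for illustration; the actual thesis formulation may differ.

```python
def to_index(value, worst, best):
    """Map a measure with arbitrary units onto a 0-1 index (1 = best).
    Works whether 'best' is the larger value (e.g., efficiency) or the
    smaller one (e.g., cost), and clips values outside the range."""
    idx = (value - worst) / (best - worst)
    return max(0.0, min(1.0, idx))

def overall_index(indices, weights):
    """Weighted sum of individual indices; weights are normalized to sum to 1."""
    total_w = sum(weights.values())
    return sum(weights[name] * idx for name, idx in indices.items()) / total_w

# Illustrative life-cycle measures for one candidate configuration.
indices = {
    "cost":        to_index(12_000, worst=20_000, best=8_000),   # dollars, lower is better
    "performance": to_index(0.82,   worst=0.50,   best=1.00),    # efficiency, higher is better
    "adapt_time":  to_index(3.0,    worst=10.0,   best=0.5),     # hours to reconfigure
}
weights = {"cost": 0.5, "performance": 0.3, "adapt_time": 0.2}
print(round(overall_index(indices, weights), 3))
```

An optimizer can then compare candidate configurations on this single overall index across life-cycle phases, which is the role the multi-level optimization plays in the abstract above.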
Item Open Access
Development of an Improved Repeater-free Acoustic Telemetry System Through Experimental Investigation and Modelling (2021-11-23)
Pagtalunan, Jediael R.; Park, Simon; Kim, Seonghwan; Xue, Deyi; Wang, Xin
Measurement while drilling (MWD) enables real-time measurement of downhole conditions for directional drilling, but most commercial MWD telemetry techniques, such as mud pulse or electromagnetic methods, suffer from limited transmission speeds. Acoustic telemetry has the potential for significantly faster transmission speeds, albeit with limited range due to drill string attenuation and noise. A common solution is to use acoustic repeaters, which incur high costs and require complex implementation. Instead of using repeaters, we utilized two carrier frequencies at the drill string modes to transmit redundant data, in combination with a lock-in amplifier (LIA) to extract the signals from the carriers. The extracted signals were then fused at the receiver to increase signal fidelity. An experimental setup was developed to transmit acoustic signals through a simulated drill string. The signals were first attenuated by the rubber section of the simulated drill string. The results show that the proposed system was able to achieve error-free transmission of packets at 64 bps up to 1.95 km without the use of a repeater, which is an order of magnitude faster than current commercial MWD methods. Moreover, the acoustic telemetry system requires the identification of the carrier frequencies near the natural frequencies of the drill string with specified boundary conditions. This work proposes a finite element (FE) model based on the Timoshenko beam theory that predicts the dynamics of an actual drill string over a wide frequency range. The frequency response of the model is compared to models in the literature with similar components. Then, three configurations that follow a specified trajectory are defined with increasing lengths and curvature to represent the drill string assembly as it approaches the target reservoir. The frequency responses of the three configurations are determined, and a carrier frequency is selected at the center of the third passband. As in the lab-scale experiments, packets of bits are first generated as telemetry data and then convolutionally encoded to reduce errors at the receiver. The signal is modulated using differential binary phase shift keying (DBPSK) and upconverted to the carrier frequency, which is used as the force input to the model.
Finally, the receiver at the surface demodulates and decodes the received acceleration to recover the transmitted bits using a digitally implemented LIA. The transmitted and received bits are again compared to calculate the bit-error rate (BER) for each signal-to-noise ratio (SNR) condition, which is used as the measure of performance. To simulate transmission, the time impulse responses are first recorded for the three different drill string configurations. These are then used to develop a finite impulse response (FIR) filter for simulation of the acoustic transmissions. The results show that the passband locations stay at the same frequencies and that transmission speed is limited by the passband widths.
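The sketch below shows the core of a digitally implemented lock-in amplifier of the kind mentioned above: mixing the received signal with in-phase and quadrature references at the carrier frequency and low-pass filtering to recover the slowly varying baseband component. The sample rate, carrier frequency, and filter settings are illustrative, and this is not the thesis's full DBPSK receiver chain.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 20_000.0        # sample rate (Hz), illustrative
fc = 1_200.0         # carrier assumed to sit near a drill-string passband (Hz)
t = np.arange(0, 0.5, 1 / fs)

# Test signal: a carrier whose phase flips by pi every 50 ms (BPSK-like), plus noise.
symbols = np.repeat(np.array([1, -1, -1, 1, -1, 1, 1, -1, 1, -1]), int(0.05 * fs))
rx = symbols * np.cos(2 * np.pi * fc * t) + 0.8 * np.random.default_rng(0).normal(size=t.size)

def lock_in(signal, fs, fc, bw_hz=100.0):
    """Digital lock-in amplifier: mix with quadrature references and low-pass filter."""
    n = np.arange(signal.size)
    ref_i = np.cos(2 * np.pi * fc * n / fs)
    ref_q = np.sin(2 * np.pi * fc * n / fs)
    b, a = butter(4, bw_hz, btype="low", fs=fs)
    i_bb = filtfilt(b, a, signal * ref_i)   # in-phase baseband component
    q_bb = filtfilt(b, a, signal * ref_q)   # quadrature baseband component
    return i_bb, q_bb

i_bb, q_bb = lock_in(rx, fs, fc)
centres = np.arange(10) * int(0.05 * fs) + int(0.025 * fs)  # middle sample of each symbol
print("sent:     ", symbols[centres])
print("recovered:", np.sign(i_bb[centres]).astype(int))
```

Narrowing the low-pass bandwidth rejects more out-of-band noise at the cost of slower symbol transitions, which is the same trade-off that ties achievable bit rate to the drill string's passband widths.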
Item Open Access
Discovering Relationships between Reservoir Properties and Production Data for CHOPS Using Data Mining Methods (2016-01-15)
Wang, Xi; Mahinpey, Nader; Wang, Xin; Dong, Mingzhe; Chen, Shengnan (Nancy)
Cold Heavy Oil Production with Sand (CHOPS) produces sand and contributes greatly to primary oil recovery. It is generally believed that wormholes, resulting from sand flow, enhance oil recovery in this process. However, due to their complexity and variability, it is difficult for wormhole models to describe precisely how wormholes develop within the formation. In this study, we regard wormholes as an integral black box and apply data mining methods to explore how reservoir attributes influence CHOPS well production. Gain ratio is used to rank and select the most important attributes for oil production. For overall oil production performance, cumulative porosity, cumulative oil saturation, effective thickness, and average shale content are the most important and relevant attributes. Decision trees constructed by the C4.5 algorithm provide details of how to classify oil production instances according to reservoir attributes. All correct classification rates are over 55%, which is a reliable level of accuracy for our results.

Item Open Access
Empirical Models for Estimating Significant Wave Height Using RADARSAT-2 Data (2019-01-08)
Ma, Meng; Collins, Michael J.; Wang, Xin; Shahbazi, Mozhdeh M.
Images captured by Synthetic Aperture Radar (SAR) are useful for retrieving ocean wave parameters. The objective of this study is to establish empirical algorithms to estimate significant wave height (Hs) from RADARSAT-2 (RS2) fine-quad (FQ) beam mode images. This is the first time Hs has been retrieved using around 1,400 collocated samples over eastern and western North America. The fitting procedures are based on linear regression and a neural network, and the models are validated against buoy observations. Unlike most related work, this study explores the effects of incidence angle and polarization on the estimation of Hs. I report five accuracy metrics and introduce a novel cost function to assess and compare the models' performance. Finally, both types of proposed models show good agreement with buoy observations; the RMSE and R reach 0.26 m and 0.97, respectively.
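As a small illustration of how retrieved Hs is typically validated against buoy data, the sketch below computes RMSE and the correlation coefficient R for a set of predicted/observed pairs. The numbers are made up, and this is not the thesis's model, dataset, or its five reported metrics.

```python
import numpy as np

def rmse_and_r(predicted, observed):
    """Root-mean-square error and Pearson correlation between model and buoy Hs."""
    predicted = np.asarray(predicted, dtype=float)
    observed = np.asarray(observed, dtype=float)
    rmse = np.sqrt(np.mean((predicted - observed) ** 2))
    r = np.corrcoef(predicted, observed)[0, 1]
    return rmse, r

# Hypothetical collocated samples: model-estimated Hs vs. buoy-measured Hs (metres).
hs_model = [1.2, 2.5, 3.1, 0.9, 4.2, 2.0, 3.6]
hs_buoy  = [1.0, 2.7, 3.0, 1.1, 4.0, 2.2, 3.9]
rmse, r = rmse_and_r(hs_model, hs_buoy)
print(f"RMSE = {rmse:.2f} m, R = {r:.2f}")
```

Low RMSE with high R against independent buoy observations is the standard evidence that an empirical SAR retrieval generalizes beyond its training samples.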
Item Open Access
An Evaluation of Low Fidelity Prototyping Techniques in Agile Release Planning for Collocated Teams (2008-06-19)
Greenberg, Saul; Ghanam, Yaser; Wang, Xin
In an Agile environment where requirements elicitation is a continuous process, low-fidelity prototypes are increasingly important. Collocated Agile teams often use the traditional whiteboard to draw these prototypes and discuss them with the customer in the release planning meeting preceding each iteration. For the same purpose, SMART Boards are also being utilized by some Agile teams. A study comparing prototyping with both tools was conducted in an academic setting; it showed an equal overall preference for the two tools, with the whiteboard perceived as better in terms of readability and the SMART Board deemed a better means of sharing. Since the tools have different pros and cons, it was suggested that both can be utilized in release planning meetings for different kinds of tasks or to accommodate different room settings.