UNIVERSITY INSTITUTE OF ENGINEERING

Subject Name–NMOUP

Subject Code – 21CSH-459

ASSIGNMENT-2

Submitted To: Kanika Rana (Faculty)
Submitted By: Ashish Sharma


UID: 21BCS7429
Section: 11
Group: B
Case Study: Advanced Polynomial Curve Fitting for Climate Data
Analysis

Objective

This study aims to develop a Python program that applies polynomial interpolation methods,
specifically Lagrange and Newton interpolation, to analyze climate data and model temperature
trends over time. The focus is on fitting these polynomial curves to capture underlying temperature
patterns and comparing the accuracy and statistical efficiency of these methods against least-squares
fitting.

Introduction

Climate change is a critical global challenge, and understanding long-term temperature trends is
crucial for predicting future climate conditions. Polynomial interpolation techniques, such as
Lagrange and Newton methods, are commonly used to approximate data by constructing smooth
curves that pass through a given set of known values. Additionally, the least-squares method is
widely used for curve fitting, aiming to minimize the overall error between observed data points and
the fitted polynomial curve.

In this study, we utilize Lagrange, Newton interpolation, and least-squares fitting to model
temperature trends over time. We evaluate these methods based on their accuracy and efficiency
when applied to historical climate data. Given the nonlinear nature of climate patterns, selecting the
most appropriate fitting technique is critical for accurate predictions.

Methodology

1. Data Collection and Preprocessing

Historical temperature data, sourced from agencies like NOAA or NASA, is used, focusing
on average annual or monthly temperatures. Preprocessing steps include:

o Handling missing data through interpolation or removal.


o Normalizing the temperature data for consistency.
o Splitting the dataset into a training set for model fitting and a test set for validation.
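A minimal sketch of these preprocessing steps using NumPy; the temperature values below are hypothetical placeholders, not real NOAA/NASA records:

```python
import numpy as np

# Hypothetical annual mean temperatures (°C); np.nan marks missing years.
years = np.arange(2000, 2010, dtype=float)
temps = np.array([14.1, 14.3, np.nan, 14.5, 14.4,
                  np.nan, 14.8, 14.9, 15.0, 15.2])

# 1. Handle missing data by linear interpolation over the gaps.
mask = np.isnan(temps)
temps[mask] = np.interp(years[mask], years[~mask], temps[~mask])

# 2. Normalize to zero mean and unit variance for consistency.
temps_norm = (temps - temps.mean()) / temps.std()

# 3. Split into a training set (first 80%) and a test set (last 20%).
split = int(0.8 * len(years))
x_train, x_test = years[:split], years[split:]
y_train, y_test = temps_norm[:split], temps_norm[split:]
```

A chronological split (rather than a random one) is used so the test set represents genuinely unseen later years, which matches how a trend model would be validated in practice.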
2. Polynomial Interpolation Techniques
o Lagrange Interpolation: Lagrange interpolation constructs a polynomial
P(x) that passes through all the given data points. It is defined as:

P(x) = \sum_{i=0}^{n} y_i \cdot l_i(x)

where l_i(x) is the Lagrange basis polynomial. Although it provides an exact
fit for small datasets, it becomes inefficient as the dataset grows, leading to issues
like Runge's phenomenon.
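As an illustration, the Lagrange form above can be evaluated directly from its definition; this is a straightforward sketch, not an optimized implementation:

```python
import numpy as np

def lagrange_eval(x_nodes, y_nodes, x):
    """Evaluate the Lagrange interpolating polynomial at x.

    P(x) = sum_i y_i * l_i(x), where l_i is the basis polynomial that
    equals 1 at x_nodes[i] and 0 at every other node.
    """
    total = 0.0
    n = len(x_nodes)
    for i in range(n):
        li = 1.0
        for j in range(n):
            if j != i:
                li *= (x - x_nodes[j]) / (x_nodes[i] - x_nodes[j])
        total += y_nodes[i] * li
    return total

# Toy check on y = x^2: the degree-2 interpolant through three points
# reproduces the function exactly.
xs = np.array([0.0, 1.0, 2.0])
ys = xs ** 2
print(lagrange_eval(xs, ys, 1.5))  # 2.25
```

The double loop makes the cost O(n^2) per evaluation point, which is one reason the method scales poorly as the dataset grows.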

o Newton Interpolation: Newton’s divided difference method constructs the
polynomial incrementally. The polynomial P(x) is represented as:

P(x) = a_0 + a_1(x - x_0) + a_2(x - x_0)(x - x_1) + \dots + a_n(x - x_0)(x - x_1)\dots(x - x_{n-1})

The coefficients a_i are computed using divided differences. Newton
interpolation is more computationally efficient than Lagrange, particularly for larger
datasets.
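A compact sketch of Newton's method: the divided-difference coefficients are computed in place, and the polynomial is evaluated in nested (Horner-like) form:

```python
import numpy as np

def newton_coefficients(x, y):
    """Divided-difference coefficients a_0..a_n, computed in place."""
    a = np.array(y, dtype=float)
    n = len(x)
    for k in range(1, n):
        # k-th order divided differences, vectorized over the tail of a.
        a[k:] = (a[k:] - a[k - 1:-1]) / (x[k:] - x[:-k])
    return a

def newton_eval(x_nodes, a, x):
    """Evaluate the Newton form at x by nested multiplication."""
    result = a[-1]
    for k in range(len(a) - 2, -1, -1):
        result = result * (x - x_nodes[k]) + a[k]
    return result

# Toy check on y = x^3: four points determine the cubic exactly.
xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = xs ** 3
coeffs = newton_coefficients(xs, ys)
print(newton_eval(xs, coeffs, 1.5))  # 3.375
```

Unlike the Lagrange form, adding a new data point only appends one coefficient instead of forcing a full recomputation, which is the source of Newton's efficiency advantage.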

3. Least Squares Polynomial Fitting

The least squares method minimizes the sum of squared residuals between the observed data
points and the polynomial approximation. The goal is to find the coefficients a_i of the
polynomial P(x) that minimize the following error function:

\sum_{i=0}^{n} (y_i - P(x_i))^2
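In Python this minimization is handled by NumPy's polyfit. The sketch below fits a quadratic to hypothetical noisy trend data; the trend coefficients and noise level are illustrative, not real climate values:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
# Hypothetical "temperature anomaly" with a quadratic trend plus noise.
y = 0.02 * x**2 + 0.1 * x + rng.normal(0.0, 0.2, x.size)

# Fit a degree-2 polynomial by minimizing the sum of squared residuals.
coeffs = np.polyfit(x, y, deg=2)       # highest-degree coefficient first
fitted = np.polyval(coeffs, x)

rmse = np.sqrt(np.mean((y - fitted) ** 2))
print(f"coefficients: {coeffs}, RMSE: {rmse:.3f}")
```

Because the fitted curve is not forced through every point, the degree can stay low even for large datasets, which is what keeps least squares stable where interpolation oscillates.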

Results

1. Accuracy Comparison:
o The RMSE (Root Mean Square Error) for the least squares method was lower than
for both Lagrange and Newton interpolation methods, especially for larger datasets.
o Lagrange interpolation worked well for small datasets but showed signs of overfitting
with higher-degree polynomials, resulting in higher RMSE values on the test set.
2. Computational Efficiency:
o Lagrange interpolation became computationally expensive as the dataset size
increased due to its global nature.
o Newton interpolation was more efficient but experienced performance drops with
very large datasets.
o Least squares fitting showed consistent performance, since the polynomial degree
stays low and fixed regardless of the dataset size.
3. Visual Analysis:
o Polynomial interpolation, particularly Lagrange, displayed oscillations between data
points (Runge's phenomenon), especially at the edges of the dataset.
o Least squares fitting provided a smoother curve with fewer oscillations, making it
more suitable for larger, noisier datasets.

Discussion

A comparison of Lagrange and Newton interpolation with least-squares fitting reveals that while
interpolation methods are effective for small and less noisy datasets, they become impractical for
real-world climate data as the dataset grows. In contrast, the least squares method is robust, less
prone to overfitting, and more suitable for long-term temperature trend analysis.
Case Study: Real-Time Signal Processing with Polynomial
Interpolation and Fitting

Objective

This study aims to design a system for real-time signal processing using polynomial interpolation
techniques. The focus is on reconstructing signals from sampled data using Lagrange and Newton
interpolation methods. This case study emphasizes noise reduction and curve fitting to achieve
accurate signal representation from noisy sensor data.

Introduction

Real-time signal processing is essential in various modern technologies, including
telecommunications, medical devices, and sensor networks. Sensor data is often noisy and sampled
at intervals, requiring preprocessing to recover the original signal. Polynomial interpolation
techniques, like Lagrange and Newton methods, provide a way to interpolate smooth curves between
discrete points, facilitating signal reconstruction and noise reduction.

This study explores how polynomial interpolation—particularly the Lagrange and Newton methods—can
be applied to reconstruct signals from sampled data. It also examines how noise impacts interpolation
accuracy and compares the performance of polynomial interpolation with other noise reduction
methods, such as filtering.

Methodology

1. Data Simulation

The real-time sensor data is simulated by generating a smooth mathematical function, such
as a sine wave, and sampling it at discrete intervals. Gaussian noise is added to these samples
to replicate real-world sensor noise, with the focus being on removing the noise and
accurately reconstructing the signal.

o True Signal (Simulated): A sine wave or other smooth periodic function is used as
the base signal, e.g., f(x) = \sin(x) + 0.5\sin(2x).
o Noisy Sensor Data: Gaussian noise is added to the sampled data, where the noisy
signal is represented as y_{noisy} = f(x) + \text{noise}.
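The simulation step can be sketched as follows, using the base signal given above; the sample count and noise level are illustrative choices:

```python
import numpy as np

def true_signal(x):
    """Smooth base signal used in the simulation."""
    return np.sin(x) + 0.5 * np.sin(2 * x)

rng = np.random.default_rng(42)
x_samples = np.linspace(0, 2 * np.pi, 40)    # discrete sampling instants
noise = rng.normal(0.0, 0.1, x_samples.size) # Gaussian sensor noise
y_noisy = true_signal(x_samples) + noise     # what the "sensor" reports
```

Keeping the true signal available alongside the noisy samples is what makes the later RMSE comparison possible: reconstruction error is measured against the known ground truth.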

2. Interpolation Methods
o Lagrange Interpolation: Lagrange interpolation constructs a polynomial that passes
through all sampled data points. However, it becomes inefficient and can lead to
oscillations (Runge's phenomenon) as the number of points increases.

o Newton Interpolation: Newton’s divided difference method builds the polynomial
incrementally and is generally more computationally efficient than Lagrange
interpolation, especially for larger datasets. It also reduces oscillations compared to
Lagrange, but is still affected by noise.

3. Noise Reduction via Polynomial Fitting

Polynomial curve fitting is used to reduce noise by fitting a lower-degree polynomial to the
noisy data. The effectiveness of different polynomial degrees in achieving a smooth, accurate
signal reconstruction is evaluated, ensuring the model does not overfit the data.

4. Signal Reconstruction Process


o Sampling: Generate sensor data at discrete intervals (sample points).
o Interpolation and Fitting: Apply Lagrange and Newton interpolation methods to
the sampled data, and fit polynomials of varying degrees to reduce noise.
o Comparison with Filtering: A low-pass filter is applied to the noisy signal as a
baseline noise-reduction technique. The results of polynomial fitting are compared
with filtering in terms of signal accuracy and smoothness.
5. Evaluation Metrics
o Root Mean Square Error (RMSE): Measures the accuracy of the fitted polynomials
on the test dataset.
o Computational Efficiency: Measures the time taken by each method to fit the data.
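The evaluation pipeline above can be sketched end to end. Note the assumptions: a simple moving average stands in here for the study's low-pass filter, the fitting degree follows the "degree 3 or 4" choice discussed later, and all signal parameters are illustrative:

```python
import time
import numpy as np

def true_signal(x):
    return np.sin(x) + 0.5 * np.sin(2 * x)

def rmse(a, b):
    """Root mean square error between two equal-length arrays."""
    return np.sqrt(np.mean((a - b) ** 2))

rng = np.random.default_rng(42)
x = np.linspace(0, 2 * np.pi, 200)
y_noisy = true_signal(x) + rng.normal(0.0, 0.2, x.size)

# Noise-reduction candidate: a degree-4 least-squares polynomial fit.
t0 = time.perf_counter()
fit = np.polyval(np.polyfit(x, y_noisy, deg=4), x)
fit_time = time.perf_counter() - t0

# Baseline: a 9-point moving average as a stand-in low-pass filter.
window = np.ones(9) / 9
smoothed = np.convolve(y_noisy, window, mode="same")

print(f"polyfit RMSE: {rmse(fit, true_signal(x)):.3f} ({fit_time * 1e3:.2f} ms)")
print(f"moving-average RMSE: {rmse(smoothed, true_signal(x)):.3f}")
```

Both reconstructions are scored against the known true signal, matching the RMSE metric above; wall-clock timing of the fit gives the efficiency metric.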

Results

1. Lagrange Interpolation: Lagrange interpolation reconstructs the signal accurately at the
sampled points but tends to oscillate heavily between points, particularly at the
edges. This results in a high RMSE, especially in the presence of noise.
2. Newton Interpolation: Newton interpolation offers similar accuracy to Lagrange but is
more computationally efficient for larger datasets. However, like Lagrange, it struggles with
noise, showing oscillations, though to a lesser extent.
3. Polynomial Fitting: Lower-degree polynomial fitting (e.g., degree 3 or 4) produces a
smoother signal reconstruction with significantly lower RMSE compared to Lagrange or
Newton interpolation. It also performs better in terms of noise reduction, without overfitting
the data.
4. Comparison with Low-Pass Filtering: The low-pass filter effectively reduces noise but
also smooths out fine details of the signal. In contrast, polynomial fitting retains more signal
detail while providing noise reduction, making it a good balance between noise suppression
and signal fidelity.

Discussion

This study shows that while Lagrange and Newton interpolation methods can accurately reconstruct
signals from sampled data, they tend to overfit in the presence of noise, causing significant
oscillations between data points. These oscillations reduce the accuracy of the reconstructed signal,
especially in real-time noisy environments.

Polynomial fitting, on the other hand, provides a more effective solution for signal reconstruction
and noise reduction. By fitting a lower-degree polynomial to the data, it is possible to mitigate the
effects of noise while preserving the overall trend of the signal. However, selecting the appropriate
polynomial degree is crucial to avoid underfitting or overfitting the data.
