Nucleosynthesis, Photosynthesis and Global Warming (Part 2)

On how coal was formed

In my previous blog, we talked about global warming, rising CO2 levels and stellar nucleosynthesis. Now, let us see how coal came into existence and the possible repercussions of a less-than-efficient usage of this precious and diminishing energy source. During the early history of Earth, the atmosphere is believed to have had a higher concentration of greenhouse gases such as carbon dioxide and methane. Free oxygen did not yet exist in the atmosphere. Then cyanobacteria, a phylum of bacteria that produce oxygen by photosynthesis, appeared. Cyanobacteria converted the early reducing atmosphere into an oxidizing one.

Evidence suggests that a major environmental change occurred around 2.3 billion years ago. This was the Great Oxygenation Event, also called the Oxygen Catastrophe or Oxygen Crisis. The GOE was the point when excess free oxygen started accumulating in the atmosphere. Free oxygen is toxic to anaerobic organisms, and the rising concentrations could have wiped out most of the Earth’s anaerobic inhabitants at the time. Eventually, however, aerobic organisms evolved, consuming oxygen and bringing the system into equilibrium, with free oxygen remaining an important constituent of the atmosphere. Free oxygen also helped oxidize atmospheric methane to carbon dioxide and water, resulting in the mixture of gases in the atmosphere today.

Complex life started appearing around the start of the Cambrian period, 541 million years ago, when complex multi-cellular organisms became more common, leading to the Cambrian explosion. While diverse life forms prospered in the oceans, the land remained comparatively barren.

Terrestrial life is believed to have been well established by around 359 million years ago, corresponding with […]

By |December 17th, 2014|Blogs|0 Comments

Nucleosynthesis, Photosynthesis and Global Warming – Part 1

Global warming – A brief history

Global warming is the increase in the surface temperature of the Earth. According to scientists, global air and sea temperatures have risen by about 0.8°C since the start of the 20th century, with almost two thirds of the increase occurring over the last three decades. Rising sea levels and a change in the volume and pattern of precipitation are clear indicators of the repercussions of global warming. Expanding subtropical deserts, an increase in the frequency of extreme climatic events, acidification of oceans and the disappearance of species due to extinction are other pointers to the impact of the global warming phenomenon.

Increasing concentrations of greenhouse gases (GHGs) in the earth’s atmosphere are cited as the cause of this rise in temperatures, and scientists hold increased human activity responsible for it. But what exactly are GHGs?

Greenhouse gases or GHGs are gases present in the atmosphere that play a critical role in maintaining the surface temperature that makes the earth fit for life. Some examples of primary GHGs are water vapor, carbon dioxide, methane, nitrous oxide, and ozone. GHGs absorb and emit radiation within the thermal infrared range; without them, the earth’s surface would be around 33°C colder than the average of 14°C that it is now!
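That 33°C figure can be checked with a simple radiative energy-balance estimate: without greenhouse absorption, the earth would settle at the blackbody equilibrium temperature set by incoming sunlight alone. A minimal sketch (the solar constant and albedo are standard textbook figures, not values from this blog):

```python
# Blackbody equilibrium temperature of the Earth with no greenhouse effect.
SOLAR_CONSTANT = 1361.0   # W/m^2, sunlight at the top of the atmosphere
ALBEDO = 0.3              # fraction of sunlight reflected back to space
SIGMA = 5.67e-8           # Stefan-Boltzmann constant, W/(m^2 K^4)

# Absorbed sunlight (spread over the sphere) = emitted thermal radiation:
#   S * (1 - a) / 4 = sigma * T^4
t_no_ghg = (SOLAR_CONSTANT * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25

t_actual = 288.0          # observed mean surface temperature, in kelvin
print(f"Equilibrium without GHGs: {t_no_ghg:.0f} K")
print(f"Greenhouse warming: {t_actual - t_no_ghg:.0f} K")
```

The balance gives roughly 255 K, about 33 degrees below the observed mean, which is exactly the gap the greenhouse gases account for.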

GHGs in the right proportion support human life while an excess can have just the opposite impact!

Increasing levels of CO2 in the atmosphere

The industrial revolution, which began around 1750, triggered large-scale use of fossil fuels and the clearing of native forests. This pushed up […]

By |December 10th, 2014|Blogs|0 Comments

Roof Top Solar Integration

With the growing installation of solar roof top applications in the distribution grid, large-scale penetration of roof top solar is no longer a distant reality. Some regulators have already given households the nod to connect their solar inverters to the grid. Many others are expected to follow suit thus driving a major change from the traditional unidirectional power flows in the low voltage distribution network.

A large-scale penetration and integration of roof top solar applications is most beneficial to economies where there is a shortage of available generation capacity. For utilities, the benefits range from cheap local alternatives to costlier fossil-fuel generation, to reduced load shedding and better portfolio management. For customers, the role of prosumer (producer as well as consumer) delivers immediate benefits such as feed-in tariffs and increased supply, which could potentially eliminate costly backups such as batteries and diesel generators. Along with the benefits, however, these large-scale penetrations bring challenges for utilities: voltage regulation, islanding, voltage imbalance, fault detection and operational safety. Addressing these challenges requires active management of the low voltage distribution network. Distribution network automation will aid utilities in implementing active network management, which helps perform local voltage control, detect the presence of stray voltages, and enable dynamic network reconfigurations and adaptive relay settings.
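As one concrete instance of local voltage control, many smart-inverter schemes use a volt-var droop curve: the inverter absorbs or injects reactive power as the local voltage drifts from nominal. A minimal sketch; the deadband, slope and var-limit values below are illustrative assumptions, not taken from any standard or from this blog:

```python
def volt_var_setpoint(v_pu, deadband=0.02, slope=10.0, q_max=0.44):
    """Reactive power command (per unit) from local voltage (per unit).

    Inside the deadband around 1.0 pu the inverter stays at unity power
    factor; outside it, Q ramps linearly (positive = injecting vars to
    raise voltage) until it saturates at the inverter's var limit.
    """
    error = 1.0 - v_pu                    # positive when voltage is low
    if abs(error) <= deadband:
        return 0.0
    # Ramp from the edge of the deadband, then clamp to +/- q_max.
    q = slope * (error - deadband if error > 0 else error + deadband)
    return max(-q_max, min(q_max, q))

# High feeder voltage (midday solar peak) -> absorb reactive power.
print(volt_var_setpoint(1.05))   # negative Q
# Low voltage -> inject reactive power.
print(volt_var_setpoint(0.96))   # positive Q
```

In a managed network, the utility’s setting commands would adjust the curve parameters (deadband, slope, limits) rather than command each inverter in real time.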

Power inverters that are a part of roof top solar applications play a vital role in the network management of a feeder with considerable solar penetration. Their participation in network management can be through autonomous functions, which a utility may want to modify through setting commands. Such autonomous functions are connect/disconnect, […]

By |July 4th, 2014|Blogs|0 Comments

16 years and ticking!!!

It has been 16 years since we started this journey that is Kalkitech. Over these years, we have tried to travel the road less taken and make a difference to the industry we are a part of. We have constantly strived to make a difference to how we organize and function as a company as well. This is no small achievement for a company that was originally started not to make money, but to make a difference. Kalkitech has made significant contributions to enabling the Smart Grid while steadily making progress on the financial front as well.

With enabling the Smart Grid and energy optimization as the central theme, we have unified our products, services and solutions strategies to focus on delivering value to end customers. This move will also see all our divisions moving closer to the end customer, focusing on key application areas and use cases that deliver real value. Going forward, solving the problems faced by end customers shall be the ONLY driving force behind all Kalkitech offerings, although the path to end customer may traverse through an OEM or an SI.

Looking ahead, the next three to five years show significant promise for Kalkitech. We have kick-started four major external-facing initiatives in 2014, in addition to SYNC-Net, in advanced DA, energy efficiency, automation and analytics, with the aim of making each of these initiatives a significant contributor to our revenues. We also plan to launch an internal initiative to drive innovation in cloud and mobile. With SYNC-Net as the communication platform, and cloud and mobile based deployment platforms combined with AD, EE, analytics and automation applications, we believe that Kalkitech will take a commanding position in the […]

By |June 19th, 2014|Blogs|1 Comment

Introduction to Energy Efficiency

Energy is the capacity to do work and all work is equivalent to the raising of a weight, i.e., motion against an opposing force. A compressed or stretched spring can raise a weight. An electric battery can be connected to a motor and used to raise a weight. A lump of coal can be burned in some type of engine and used to do work. All human activity depends on conversion of one form of energy to another, using various processes – biological, chemical, thermal, electrical, mechanical, to name a few.

One of the most useful conversions is that of thermal energy to work – such as in a power plant or an automobile. A heat engine is a device that converts thermal energy to work. The laws of thermodynamics govern these conversions, and place limits on feasibility and efficiency of these conversion processes. Specifically, the efficiency of heat engines has an upper limit dictated by the second law of thermodynamics, as a consequence of which, thermal power plants too have a theoretical maximum efficiency. Unfortunately, it is usually a lot less than 100%. Before we delve deeper into an analysis of heat engine efficiency, it is instructive to think about the nature of heat itself. What exactly is heat?
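The second-law limit mentioned above is the Carnot efficiency, η = 1 − T_cold/T_hot, with both temperatures in kelvin. A quick sketch with typical textbook steam-plant temperatures (illustrative figures, not data from any particular plant):

```python
def carnot_efficiency(t_hot_k, t_cold_k):
    """Maximum fraction of heat a reversible engine can convert to work."""
    return 1.0 - t_cold_k / t_hot_k

# Typical steam plant: ~550 C turbine inlet, ~30 C cooling water.
t_hot = 550 + 273.15     # kelvin
t_cold = 30 + 273.15
eta = carnot_efficiency(t_hot, t_cold)
print(f"Carnot limit: {eta:.1%}")  # well under 100%
```

Real plants fall further below this limit because of friction, incomplete combustion and other irreversibilities; the Carnot figure is only the ceiling the second law allows.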

A widely held belief till the mid-19th century was that heat was some type of imponderable fluid, known as caloric, that was able to do work, as it flowed from hot to cold. Heat was always present in latent or sensible form, was material in nature, and was a conserved physical substance. Even today, we think of a hot object as “containing” more heat than a cold object. Heat “flows” […]

By |June 9th, 2014|Blogs|0 Comments

Standards and Technologies used to achieve interoperability in AMI

In the second blog of the series on AMI trends, we talked about the need for a holistic approach to achieve interoperability. In this final blog, we will dive deep into the XML-based file format, the lower network (NAN), the DCU, and edge-routing and mesh-networking schemes.

The interface between the meter and the HHU is already standardized by the meter standard (usually defined for both the electrical and the optical port of the meter). However, the interface between the HHU and the HES is undefined at present. The data is a binary-block dump over a local port that can be considered highly reliable, so a standard layered protocol is not required: we do not need to consider packet losses, corruption, retries, etc. The most efficient format for this data dump is a file format, preferably an XML-based format that provides markup on top of the data. A custom-designed XML schema can be defined for this purpose if no standard schemas are available; however, it needs to be vetted for all use cases, including newer smart-grid use cases such as programming-parameter loading and tariff writing. Such a custom-designed XML format also needs to be passed through an SDO process and released as a standard.
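To make the idea of an XML markup over the binary dump concrete, here is a small sketch using Python's standard library. The element and attribute names (MeterReadSet, Read, obis, value) are invented for illustration; a real deployment would follow whatever schema the SDO process eventually fixes:

```python
import xml.etree.ElementTree as ET

# Build a minimal readings document for one meter. OBIS-style codes are
# used as read identifiers; all names here are hypothetical placeholders.
root = ET.Element("MeterReadSet", meterId="MTR-001",
                  readAt="2014-06-02T10:00:00")
for obis, value in [("1.8.0", "12345.6"), ("2.8.0", "40.2")]:
    ET.SubElement(root, "Read", obis=obis, value=value, unit="kWh")

xml_dump = ET.tostring(root, encoding="unicode")
print(xml_dump)
```

Because the transfer is a reliable local file copy, the HES only needs to validate the document against the agreed schema on import; no acknowledgement or retry machinery is required.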

The lower network (NAN) typically offers some choices in the application layer, layered on top of […]

By |June 2nd, 2014|Blogs|0 Comments

Need for holistic approach to achieve Interoperability in AMI industry

In the first blog on recent trends in AMI, we talked about the need for interoperability between devices and solutions and the development of standardized metering protocols. We also discussed how the smart grid industry, utilities and manufacturers gradually came to terms with the fact that universal metering standards would be more of a benefit than a threat. Continuing from the previous blog, we will now delve into the holistic approach required to achieve interoperability.

From an age where meters were read manually using hand-held units, we have moved to an array of wireless and wired technologies such as cellular, low-power radio and power-line carrier, with architectures ranging from simple radial schemes to complex meshed schemes. In this transport model, ensuring the interoperability of the meter at its reading port alone is not sufficient.

We now need to take a holistic view of the entire chain from the meter upwards and ensure interoperability at every link in the chain. We also need to consider that many, if not most, benefits of a connected smart meter derive from the “connected” part; a manual transport mechanism can only be a last resort for specific deployments where no “connected” technology is possible. To ensure interoperability at each link in the chain, utilities need to examine multiple carrier mechanisms before selecting one for deployment. This concept is best expressed as a collection of profiles or stacks (borrowing from protocol-layering terminology), where each stack is defined as closely and tightly as possible but is only one option among many.
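One way to picture the “collection of stacks” idea is as named profiles, each pinning a concrete technology choice at every layer of the chain. The layer and technology labels below are illustrative groupings, not definitions from any standard:

```python
# Each profile fixes one technology per layer; interoperability is tested
# per profile, and a deployment picks exactly one profile per segment.
PROFILES = {
    "rf-mesh": {
        "application": "DLMS/COSEM",
        "network":     "6LoWPAN mesh",
        "physical":    "low-power RF",
    },
    "plc": {
        "application": "DLMS/COSEM",
        "network":     "PLC routing",
        "physical":    "power-line carrier",
    },
    "cellular": {
        "application": "DLMS/COSEM",
        "network":     "IP over cellular",
        "physical":    "GPRS/3G",
    },
}

def common_layers(a, b):
    """Layers on which two profiles already agree; a translation point
    is needed at each layer on which they differ."""
    return {layer for layer in PROFILES[a]
            if PROFILES[a][layer] == PROFILES[b][layer]}

print(common_layers("rf-mesh", "plc"))
```

Sharing the application layer across profiles (as in this sketch) is what lets the head-end system stay vendor-neutral even when the carrier technologies below differ per segment.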

Figure 1 represents a […]

By |May 26th, 2014|Blogs|0 Comments

Trends in Advanced Metering Infrastructure (AMI)

The global AMI scene has picked up tremendous momentum in recent years with the development of Smart Grid programs, in which the AMI component is arguably the most significant subset. A recent report estimates that Smart Grid programs will generate cash flow of around USD 140 billion from 2014 to 2022. The incentive for this kind of spending is the possibility of recovering investment and generating profits by optimizing demand-supply gaps, reducing losses, and increasing billing efficiency and accuracy. AMI initiatives also have the potential to deliver calculated benefits from intangibles such as securing the grid against cascading failures and attacks.

The smart grid movement calls for interoperability between devices and solutions from multiple vendors on a global scale. The first step in that direction was the establishment of open, metering-specific international protocol standards such as DLMS/COSEM and their adoption by various country and regional bodies. At the time, meter manufacturers, deeply immersed in proprietary solutions for decades, were strongly against this imposition from the outside world. This was mainly due to apprehensions that they could no longer deliver an end-to-end solution to utilities once the solution consisted of interoperable parts supplied by different vendors. This mindset can be attributed to the long-standing utility practice of vesting responsibility for the meters, the hand-held units and the data collection software with the same party for each individual deployment, a mechanism that both utilities and manufacturers had stabilized over the years and were happy with. However, the winds of change blowing over the utility industry as a whole (not just the metering aspect) dictated new terms: vendor independence through multi-vendor solutions and interoperability through open standards.

By |May 20th, 2014|Blogs|0 Comments

Smart Grid Interoperability Challenge in the last mile and IPv6, IETF, IEC and IEEE standards architecture

Let us begin with my understanding of what interoperability is.

Interoperability is defined in various fora but essentially describes the ease and ability with which different components of a networked system can communicate effectively with each other. A highly interoperable system is one where

  • The different blocks or nodes of the system speak the same language between each pair and do not require translator devices to intermediate
  • Common modeling and naming of the fundamental information to be exchanged end-to-end across the system is available. This may not be the same as the model and naming used by each communication link or protocol, since different standards may be used at different levels; a system-wide definition of model and naming maintains a common anchor for each piece of information and significantly aids interoperability
  • The blocks or nodes representing equipment can be easily sourced from multiple vendors without requiring significant rework of the system configuration

Incidentally, interchangeability is a related concept and an even more difficult goal to attain than interoperability. A system that provides for interchangeability would ensure seamless integration of equipment from multiple vendors, effectively allowing drop-in replacement of any vendor’s equipment for any specified block in the system without requiring ANY rework of configuration. This is an almost impossible ideal at the current state of technology, but specific initiatives are moving towards this goal so that a certain “degree” of interchangeability can be achieved.

Next let us look into the various challenges the industry needs to surmount to achieve interoperability:

Technology Challenge – There are a plethora of technologies that promise various levels of performance, reliability, security, cost and ease […]

By |May 5th, 2014|Blogs|0 Comments

Distribution Management System

Distribution operations are currently getting a huge makeover due to the large investments planned under various government initiatives. Effective operation of the distribution network allows better power-flow management throughout the network and optimal utilization of sources and loads. A DMS plays an important role in the effective operation of the distribution system by enhancing routine network monitoring, fault location and restoration, and network planning; further, it lays the foundation for future Smart Grid initiatives.

Usually we can find two types of DMS software in the market:

  • SCADA-centric DMS
  • Standalone DMS integrated with existing utility systems

Many vendors and developers provide distribution management solutions, and it is the utility’s choice to select the one that suits its requirements. This choice often becomes important when the utility plans large investments.

Distribution operations differ from substation operations, where assets are located at a single site and operators have access to each device. In a distribution network, assets are spread across a large geographical area, so locating a device and grouping operations by zone are difficult (for example, when multiple operations or switching plans must be decided during big outages). Utilities generally follow operational guidelines (workflows) for distribution network operations, which ensure proper decision making and reduce accidents during maintenance. DMS systems for utilities should therefore have the features to deal with these requirements.
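A workflow of the kind described above can be enforced in software as an ordered checklist that refuses out-of-order steps. A minimal sketch; the step names are illustrative, not taken from any utility’s actual switching procedure:

```python
class SwitchingWorkflow:
    """Enforce an ordered switching procedure: each step may only be
    completed after every step before it."""

    def __init__(self, steps):
        self.steps = list(steps)
        self.completed = 0          # index of the next step to perform

    def complete(self, step):
        expected = self.steps[self.completed]
        if step != expected:
            raise ValueError(f"out of order: expected '{expected}', got '{step}'")
        self.completed += 1
        return self.completed == len(self.steps)   # True when done

# Illustrative isolation procedure for a feeder section.
wf = SwitchingWorkflow([
    "open upstream breaker",
    "verify feeder is dead",
    "apply earth switch",
    "issue permit to work",
])
wf.complete("open upstream breaker")
wf.complete("verify feeder is dead")   # earthing is refused until this passes
```

Embedding the workflow in the DMS, rather than on paper, is what lets the system block an unsafe operation instead of merely documenting it afterwards.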

Most SCADA-centric DMS solutions have evolved from the vendor’s proven SCADA platform with native communication protocols, with the DMS applications built on top of it. Here the operator can perform DMS operations as mere SCADA […]

By |November 19th, 2013|Blogs|0 Comments