Online fraud is an unnecessary headache for online merchants. According to CyberSource Corp., online merchants were estimated to lose $3.0 billion to fraud. Although this is a small percentage of revenue (1.4% in 2006), it is the manual review process that hurts the most. CyberSource estimates that 81% of merchants manually check orders, and the average rate of manual review exceeds one out of four orders.
How should businesses tackle online fraud?
Typically, depending on the size of the business, a three-pronged strategy is applied to tackle fraud: (1) rule-based systems, (2) neural network algorithms, and (3) data mining techniques. The challenges with the available technologies are that (1) all the techniques are based on historical data, (2) they are unable to detect fraud as it occurs, and (3) neural network and data mining techniques are hard to implement and expensive.
One of the best ways of tackling online fraud is to catch fraudulent transactions as they happen – in real time. Businesses have a variety of techniques and software at their disposal to implement real-time fraud checks. However, since such rules are CPU-intensive, they increase the latency of processing the transaction, and more rules lead to a non-linear increase in processing time. At the same time, the time budget to process a payment transaction cannot exceed the limits set by the networks (such as Visa and MasterCard).
Hence what businesses need is low-latency payment processing software that can authorize transactions as well as process multiple rule-based fraud checks (such as Address Verification Service, Card Verification Number, blacklist checks, velocity monitoring and more) in real time without impacting the time budget for authorizing a transaction. The good news is that such payment software is available today, built on next-generation application server frameworks that authorize thousands of transactions per second with single-digit-millisecond latency.
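To make this concrete, below is a minimal Java sketch of the kind of real-time screening described above: several rule-based checks (stand-ins for AVS, CVN, blacklist and velocity checks) run in parallel under a hard time budget so they cannot blow the authorization window. The rules, thresholds and transaction fields are made-up assumptions for illustration, not any vendor's actual API.

```java
import java.util.List;
import java.util.concurrent.*;

// Sketch of running several rule-based fraud checks in parallel under a
// hard time budget, so screening stays within the authorization window.
public class FraudScreen {

    // Each rule answers one question: does this transaction look suspicious?
    interface FraudRule {
        boolean isSuspicious(Transaction txn);
    }

    // Hypothetical transaction record used only for illustration.
    record Transaction(String pan, String billingZip, String cvn, double amount) {}

    private final ExecutorService pool = Executors.newFixedThreadPool(8);

    // Placeholder rules standing in for AVS, CVN, blacklist and velocity checks.
    private final List<FraudRule> rules = List.of(
            txn -> !"95054".equals(txn.billingZip()),   // AVS-style address mismatch
            txn -> txn.cvn() == null,                   // missing card verification number
            txn -> txn.pan().startsWith("411111"),      // blacklist lookup stand-in
            txn -> txn.amount() > 5_000.0               // crude velocity/amount threshold
    );

    /** Runs all rules concurrently; a rule that misses the budget is treated as a pass. */
    public boolean screen(Transaction txn, long budgetMillis) throws InterruptedException {
        List<Callable<Boolean>> tasks = rules.stream()
                .map(rule -> (Callable<Boolean>) () -> rule.isSuspicious(txn))
                .toList();

        // invokeAll cancels any task still running once the budget expires.
        List<Future<Boolean>> results =
                pool.invokeAll(tasks, budgetMillis, TimeUnit.MILLISECONDS);

        for (Future<Boolean> f : results) {
            try {
                if (!f.isCancelled() && f.get()) {
                    return true;            // at least one rule flagged the transaction
                }
            } catch (ExecutionException e) {
                // A failing rule should not block authorization; log and continue.
            }
        }
        return false;                       // no rule fired within the time budget
    }

    public static void main(String[] args) throws InterruptedException {
        FraudScreen screen = new FraudScreen();
        Transaction txn = new Transaction("4111111234567890", "10001", "123", 120.00);
        System.out.println("suspicious = " + screen.screen(txn, 5)); // 5 ms budget
        screen.pool.shutdown();
    }
}
```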
Friday, November 23, 2007
Book Review: The Little Book That Makes You Rich – by Louis Navellier

Although I am skeptical of self-help books such as “How to Be a Millionaire,” this book does a good job of covering the key topics involved in growth investing: number-based fundamental analysis, risk management and portfolio planning. It is a short book and is easy to read.
The key take-home point is to focus on numbers as opposed to stories. Numbers tell the truth. Wall Street and company management are often selling stories, but numbers cannot hide anything except the facts. Louis Navellier covers eight fundamental variables that are a good measure of how a company is doing. Thanks to the author, he has provided access to his exclusive stock rating tool, which gives a qualitative rating on each fundamental variable for over 5,000 stocks in his database.
http://www.getrichwithgrowth.com/
One point he calls out is that his weighting of the fundamentals varies from time to time, and how he does that is one of his secret sauces. Another interesting topic he touches upon is alpha. According to him, the key driver behind alpha is institutional buying. He claims that today most pundits and gurus use the term high-alpha stocks incorrectly. Since this is an important differentiator, I expected him to give this topic more coverage through examples. One point I liked is that it is not enough to have strong fundamentals; if there is no buying pressure, the stock may not move for a significant period of time.
All in all, it’s an easy-to-read book that covers the essentials for growth investing.
Wednesday, November 21, 2007
Results = Strategy + streamlined organization + accountability
Define the strategy, implement the best organization structure based on available resources to execute on the strategy, hold people accountable against objectives, and measure results.
It is surprising how many companies, or departments within companies, struggle to deliver results. Either the strategy is not clearly understood by one and all, the organization structure is not streamlined, or people are not held accountable.
The first step is to ask whether the strategy is well understood. Can the employees communicate what the strategy is? Simple is better, and simple is very powerful. If it is not simple, it will not be understood, cannot be communicated by all and therefore cannot be implemented. I remember reading Peter Drucker’s Adventures of a Bystander, where he talks about how Alfred Sloan asked his executive team whether the strategy was understood by the shop-floor employees. When one executive questioned the need, Mr. Sloan responded that it is the shop-floor employees who would ultimately execute on the strategy, and if they do not understand it, then it is destined for failure. Yes, it has to be simple and easily understood.
Second is to have an organization structure that is optimized to execute on the strategy. There is no one-size-fits-all. The organizational structure and the balance of the leadership team will differ for a company selling technology gear versus products versus solutions. On one end of the spectrum lie technology companies, which have an overload of techies on the management team and are potentially run by technical folks who are also competent in sales. On the other end of the spectrum are the solution-selling companies, where sales and marketing departments play the dominant role. Organization structure is also a function of company maturation. It is typical for small companies to be organized by functional areas such as sales, marketing, development, etc., whereas large companies are structured around business units with profit and loss responsibilities.
Finally, it is about accountability. Measure, align incentives, and optimize based on results! Measure each department against stated objectives and desired outcomes. Measurement has to be quantitative; it is important to take subjectivity out of the equation as much as possible. Sales results are visible and immediately quantifiable. As one moves from the sales end of the spectrum toward engineering and development, measurement becomes less quantifiable. For example, for products that are only part of a solution and not sold directly, how would an organization measure the effectiveness of the product managers of those products? Some measurement metrics can always be established based on the context of the problem domain. Next is the alignment of incentives, which makes sure that the engine is well-oiled.
Only when actual results can be compared against the desired outcomes can management optimize the entire process for maximum organizational efficiency and achieve the best results.
Wish there was a BRIC currency ETF

The dollar has been in free fall. Against the euro, it has depreciated by 20% over the last two years. Over the same time frame, the dollar has depreciated 20% against the Brazilian real, 13% against the Russian rouble, 15% against the Indian rupee, and 9% against the Chinese yuan. In the long run, however, it is the BRIC currencies against which the dollar will depreciate the most. Just like the many currency ETFs that already exist, I wish there were a BRIC ETF that allowed us to take long positions in the BRIC currencies.
First, there are the macroeconomic fundamentals that are driving the growth of these nations and thereby causing their currencies to appreciate. In fact, 2007 may prove to be the year of divergence, in which the BRIC nations take the baton of driving global growth from the developed nations.
Second, the currencies of China and India are managed floats, not free floats, which means that when they are allowed to float freely, those currencies should get a quick bump.
Finally, sovereign wealth funds have been big buyers of the BRIC currencies. According to Stephen Jen, chief currency economist at Morgan Stanley, sovereign wealth funds have been an important driver of currencies in emerging markets, particularly in the BRIC countries.
I hope that a BRIC currency ETF is introduced soon.
Thursday, November 15, 2007
Short-term dollar rally?

A while back it was the financial press that was negative on the dollar. These days, it is the turn of the regular press. If everyone is so negative on the dollar, contrarian thinking would expect a dollar rally.
Last week, the press was all over the news that Gisele Bundchen would only accept Euros for her work because the U.S. dollar is "too weak." On Thursday, the tourism ministry in India issued orders that foreign tourists visiting ticketed monuments and heritage sites in India would need to pay in rupees instead of dollars.
Although in the longer term the fundamentals are stacked against the greenback, in the short term a dollar rally may be at hand. This may result from a global sell-off. If the global stock markets see further routs, cash will flow into the safe haven of US Treasuries, thereby driving the dollar up.
Monday, November 12, 2007
XTPP – eXtreme Transaction Processing Platform
As markets mature, Gartner has retired some of its mature Magic Quadrants, such as Application Servers and High-Performance Computing. This is a result of market maturation, vendor consolidation and the rise of specialized platforms. One such specialized platform is termed the eXtreme Transaction Processing Platform (XTPP).
In an always-on lifestyle, demand for real-time applications is on the rise. Although many transactions happen in real time, the processing behind the scenes is based on batch architectures. To enable end-to-end processing of transactions in real time, existing transactional applications will go through dramatic change and will therefore require new architectures to meet the demands of real-time applications.
Today’s high-demand transactional systems use transaction processing monitors, while low-demand transactional systems are implemented using traditional application servers. XTPP will be an alternative to such TP monitors and application servers for implementing next-generation, real-time, high-demand transactional systems.
Where traditional transaction processing monitors delivered on performance, scalability and availability, and existing application servers delivered on developer productivity, XTP platforms deliver on both fronts. On one hand they meet the needs of extreme performance, unparalleled scalability and continuous availability; on the other they support programming models that deliver business agility.
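As an illustration of the XTP idea, here is a hedged Java sketch, not modeled on any specific vendor's product: hot transactional state stays in memory, and every transaction for a given key is routed to the same single-threaded partition, so throughput scales by adding partitions while the business logic remains a plain function over in-memory state.

```java
import java.util.Map;
import java.util.concurrent.*;

// Illustrative sketch of partitioned, in-memory transaction processing in the
// spirit of an XTP platform (not any vendor's actual product or API).
public class PartitionedProcessor {

    private final int partitionCount;
    private final ExecutorService[] partitions;                 // one thread per partition
    private final Map<String, Long> balances = new ConcurrentHashMap<>();

    public PartitionedProcessor(int partitionCount) {
        this.partitionCount = partitionCount;
        this.partitions = new ExecutorService[partitionCount];
        for (int i = 0; i < partitionCount; i++) {
            partitions[i] = Executors.newSingleThreadExecutor();
        }
    }

    // All transactions for the same account hash to the same partition, so the
    // business logic runs single-threaded per key and needs no explicit locks.
    public CompletableFuture<Long> debit(String account, long amount) {
        int p = Math.floorMod(account.hashCode(), partitionCount);
        return CompletableFuture.supplyAsync(
                () -> balances.merge(account, -amount, Long::sum),  // returns new balance
                partitions[p]);
    }

    public void shutdown() {
        for (ExecutorService e : partitions) e.shutdown();
    }

    public static void main(String[] args) {
        PartitionedProcessor proc = new PartitionedProcessor(4);
        long balance = proc.debit("acct-42", 25).join();            // waits for the partition
        System.out.println("balance after debit: " + balance);
        proc.shutdown();
    }
}
```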
Friday, November 9, 2007
Transaction processing is back in vogue
Transaction processing was the in-thing in the 80s, which saw the rise of IBM transaction processing monitors such as CICS, IMS and the Transaction Processing Facility (TPF). These were the platforms of choice for supporting online transaction processing (OLTP) applications, which support large numbers of concurrent users and update large shared databases. In the 90s the phrase transaction processing became passé. Since the 2000s there has been a dramatic rise in transaction processing resulting from the explosion of eCommerce, and transactions have started being accessed and processed through multiple delivery channels as opposed to a single channel.
Suddenly transaction processing is back in fashion. Traditionally, transaction processing has been the domain of mainframes that run big businesses across the globe, and these have been viewed as back-office functions built on mature business processes that do not require much change. However, the rise of the Net, eCommerce and now mCommerce has brought mundane transaction processing to the forefront. The traditional transactional systems are under strain. First, they have to process millions of concurrent transactions within a couple of seconds at most, and these transactions can originate from anywhere in the world. Second, these transactions transcend multiple delivery channels and multiple applications, unlike in the past, when each transaction type originated from a single channel and was processed by a single application system. Third, to keep up with changing consumer and business demands and to serve wider geographies and broader demographics, the transaction systems need to undergo rapid change. Changing times lead to changing needs. The old and mature transaction processing systems are now at the forefront of IT transformation so that businesses can keep up with changing times.
A big challenge, indeed, for the IT manager of those systems!
Tuesday, November 6, 2007
Green-tech: Application designer beware!

Today’s enterprise IT organization is faced with an interesting dilemma – the need to scale for business growth but limited or no space for data-center expansion. Provided below is a link that highlights the challenge data-centers in London are facing – lack of space for expansion.
http://www.finextra.com/community/fullblog.aspx?id=338
Gartner believes that the green theme will become a decision criterion for IT over the next five years.
From an application design standpoint, this can be addressed by looking at how much space the hardware requires and at the hardware footprint the application software requires. Hardware vendors have addressed the space problem by migrating to blade architectures. Application software remains the long pole in this space problem. Not many software solutions are designed from a green standpoint. Most commercial software designed on the principles of n-tier architecture requires more servers than would be optimal to deliver a business solution. Grid and virtualization software are a step in the direction of green IT. However, future application design needs to answer upfront how many servers would be required to meet given scalability requirements. Both space and power-consumption constraints will impose design considerations on application designers.
Power consumption is becoming an issue.
N-tier architecture is a design paradigm right out of computer science design classes – an elegant architecture that is modular and therefore flexible. However, most of these applications, which are supposed to scale to hundreds of thousands of users and thousands of concurrent users, solve the scalability issue by throwing hardware at the problem. This combination of n-tier design principles and scalability needs leads to server sprawl and therefore an “environmentally unsustainable” IT infrastructure.
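To show what answering the server-count question upfront might look like, here is a back-of-the-envelope Java sketch. The throughput figures, headroom and redundancy numbers are assumptions chosen purely for illustration.

```java
// A rough sizing sketch: how many servers does a target workload require?
// All numbers below are made-up assumptions, not measurements.
public class ServerSizing {

    static int serversNeeded(double peakTps, double tpsPerServer,
                             double headroom, int redundantNodes) {
        // size for peak load plus headroom, then add redundancy for availability
        int forLoad = (int) Math.ceil(peakTps * (1.0 + headroom) / tpsPerServer);
        return forLoad + redundantNodes;
    }

    public static void main(String[] args) {
        double peakTps = 2_000;        // assumed peak transactions per second

        // A chatty n-tier design handling ~50 TPS per server versus a leaner
        // design handling ~500 TPS per server on the same hardware.
        int sprawl = serversNeeded(peakTps, 50, 0.3, 2);
        int lean   = serversNeeded(peakTps, 500, 0.3, 2);

        System.out.printf("n-tier sprawl: %d servers, leaner design: %d servers%n",
                          sprawl, lean);
        // Fewer servers means less rack space and lower power draw; the green
        // footprint is largely decided at design time.
    }
}
```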
Next-generation application designs should take both scalability and “environmentally sustainable” IT into account upfront.