Subset polynomial regression is more flexible than full polynomial regression for modeling data. When a subset polynomial regression is fitted to data, its parameters are generally unknown. This paper proposes a method for selecting a subset polynomial regression whose order is unknown. The parameters of the subset polynomial regression are estimated by the Bayesian method. However, the Bayesian estimator cannot be found analytically. To solve this problem, the reversible jump MCMC algorithm is proposed. The key to this algorithm is the construction of a Markov chain that converges to the posterior distribution as its limiting distribution.
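The reversible jump MCMC machinery itself is beyond a short sketch, but the underlying model-selection problem can be illustrated with a simpler stand-in: fitting every candidate subset of polynomial terms by least squares and scoring each by BIC. All function names and data below are illustrative assumptions, not taken from the paper.

```python
from itertools import combinations
import numpy as np

def fit_subset_poly(x, y, powers):
    """Least-squares fit of y ~ sum_j b_j * x**p_j for a chosen subset of powers."""
    X = np.column_stack([x ** p for p in powers])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    return beta, rss

def select_subset(x, y, max_power=3):
    """Score every non-empty subset of {x^0, ..., x^max_power} by BIC; lower is better."""
    n = len(x)
    best = None
    all_powers = range(max_power + 1)
    for k in range(1, max_power + 2):
        for powers in combinations(all_powers, k):
            _, rss = fit_subset_poly(x, y, powers)
            bic = n * np.log(rss / n + 1e-12) + k * np.log(n)
            if best is None or bic < best[0]:
                best = (bic, powers)
    return best[1]

# Synthetic data generated from the subset {x^0, x^3}.
rng = np.random.default_rng(0)
x = np.linspace(-2, 2, 200)
y = 1.0 + 0.5 * x**3 + rng.normal(0, 0.1, x.size)
sel = select_subset(x, y)
print(sel)
```

The BIC penalty plays a role loosely analogous to the posterior's preference for parsimonious subsets; the paper's Bayesian/RJMCMC treatment handles the same trade-off probabilistically.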
Role of Information Communication Technology (ICT) in Inventory Management of Small to Medium Enterprises (SMEs): A Case Study of Chikwanha Business Centre in Chitungwiza, Zimbabwe
Rugare Chitiga and Farai Choga
The Zimbabwean economy declined in the past decades. Many industries closed, leading to the mushrooming of the informal sector. Chikwanha has developed into a hub of many SMEs involved in different activities. No study had been carried out to ascertain how ICT has influenced the stock or inventory management of SME activities at Chikwanha. This research study aimed to determine the extent of ICT influence on stock management. A qualitative approach was used, with interviews and questionnaires employed for data generation. The findings showed that the Internet was not used despite the availability of iPads and smartphones on the market, so the benefits of Internet usage in trade and inventory management were not experienced. Limited benefits such as accuracy, processing speed, and reduced theft and stock shortages were realized through the use of computers. However, a number of challenges were faced. The major challenge was an unreliable electricity supply, which affected the usage of computers. Lack of computer skills also hampered the usage of computerized inventory systems. It was recommended that SME staff be trained in the use of computers and that the use of iPads and smartphones be encouraged.
Soumi Sarkar, Taniya Seal, Samir K. Bandyopadhyay
One fundamental problem in sentiment analysis is the categorization of sentiment polarity. Consider the review “I like multimedia features but the battery life sucks.” This sentence carries mixed emotion: the sentiment regarding multimedia is positive, whereas that regarding battery life is negative. Hence, it is necessary to extract only those opinions relevant to a particular feature (such as battery life or multimedia) and classify them, rather than taking the complete sentence and its overall sentiment. In this paper, we present an approach to feature-level sentiment analysis of text.
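A minimal sketch of this feature-level idea, assuming hand-made lexicons and aspect keywords (none of which come from the paper): split the review at contrastive markers and score each clause against the aspect it mentions.

```python
import re

# Tiny illustrative lexicons; a real system would use a learned classifier.
POSITIVE = {"like", "love", "great", "good"}
NEGATIVE = {"sucks", "bad", "poor", "hate"}
ASPECTS = {"battery": "battery life", "multimedia": "multimedia"}

def aspect_sentiments(review):
    """Split on contrastive markers, then score each clause's aspect separately."""
    clauses = re.split(r"\bbut\b|\balthough\b|[;,]", review.lower())
    results = {}
    for clause in clauses:
        words = set(re.findall(r"[a-z]+", clause))
        for key, aspect in ASPECTS.items():
            if key in words:
                if words & POSITIVE:
                    results[aspect] = "positive"
                elif words & NEGATIVE:
                    results[aspect] = "negative"
    return results

out = aspect_sentiments("I like multimedia features but the battery life sucks.")
print(out)
```

Splitting at “but” is what lets the two opposing opinions in the abstract's example receive separate labels instead of one overall polarity.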
S. Mercy Gnana Gandhi
Technology is ubiquitous, touching almost every part of our lives and our communities. Integrating technology into classroom education leads to the teaching and learning of fundamental computer skills and software programs. Presently, effective technology integration must take place across the curriculum to enhance the learning process. Active engagement, group participation, frequent feedback, and links with real-world experts are the basic amenities of technology education. The innumerable resources of the online world provide each classroom with diverse learning materials, and tech tools offer students numerous ways to experiment with whatever they require. Thus technology changes the methodology of teaching and offers educators useful ways to attain and assess student understanding through multiple means.
Dr. Farai Choga
The Zimbabwean economy declined between 1997 and 2008. Production in the manufacturing sector went down, and the employment rate was severely affected as some companies closed business. The purpose of this paper is to determine the participation of women in the Zimbabwean ICT sector. A qualitative research paradigm was preferred, as the researcher sought a detailed understanding of Zimbabwean women’s experiences as narrated by the women themselves. Data were generated through face-to-face interviews, as the researcher sought detailed explanations of the participants’ experiences, and purposive sampling was used to select the “information rich” participants who could provide relevant data. Data analysis showed that women were more affected by family problems at work than men. Women were involved in household activities and found it difficult to commit to long working hours in their ICT jobs; as a result, men acquired more skills than women. Women were reported to possess a “pass-on” attitude whenever hard work was involved, requesting men to do it for them. In addition, women experienced work breaks during maternity leave, which disturbed the skills acquisition process, and as a result they lagged behind men. Women were reported to lack confidence, as the culture classifies them as “weak”; this affected their mindset and put them at a disadvantage during job interviews when competing against men. Women were also said to prefer less demanding jobs, to be interested in routine work, and to be afraid of challenging issues. Men were reported to be hard-working but to produce more errors than women. It was recommended that families be educated to raise their children as equals and that government policies be put in place to remove the mindset that looks down upon the girl child.
Prakash Chandra Bhatt and Heena Joshi
In the role of database designer, you look for the most efficient way to organize your schemas, tables, and columns. As when tuning application code, you minimize I/O, keep related items together, and plan ahead so that performance stays high as the data volume increases. Starting with an efficient database design makes it easier for team members to write high-performing application code, and makes the database more likely to endure as applications evolve and are rewritten. Database performance depends on several factors at the database level, such as tables, queries, and configuration settings. These software constructs result in CPU and I/O operations at the hardware level, which you must minimize and make as efficient as possible. As you work on database performance, you start by learning the high-level rules and guidelines for the software side and measuring performance using wall-clock time. As you become an expert, you learn more about what happens internally and start measuring things such as CPU cycles and I/O operations. The query optimizer attempts to determine the most efficient way to execute a given query by considering the possible query plans. Different database engines, such as Oracle, MySQL, and SQL Server, perform this optimization differently. This paper compares database query optimization between Oracle and MySQL and examines how it takes place in each engine.
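A small illustration of inspecting the optimizer's chosen plan, using SQLite as a stand-in engine (the paper's comparison concerns Oracle and MySQL, which expose plans via EXPLAIN PLAN and EXPLAIN respectively):

```python
import sqlite3

# SQLite stands in here for Oracle/MySQL: every engine exposes some way to
# inspect the plan its optimizer chose for a query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders (customer, total) VALUES (?, ?)",
                 [("alice", 10.0), ("bob", 25.0), ("alice", 7.5)])

# Without an index the optimizer must scan the whole table...
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = 'alice'").fetchall()

# ...after adding an index it can search by key instead of scanning.
conn.execute("CREATE INDEX idx_customer ON orders(customer)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = 'alice'").fetchall()

print(plan_before)
print(plan_after)
```

The same scan-versus-index-search distinction is the first thing a plan comparison between Oracle and MySQL would surface, even though each engine formats its plan output differently.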
Mohanty Anita, Prasad Suman Sourav and Mishra Sambit Kumar
Big Data is a data analysis methodology enabled by recent advances in technologies and architecture. However, big data entails a huge commitment of hardware and processing resources, making adoption costs of big data technology prohibitive to small and medium sized businesses. It is understood that cloud computing is primarily responsible for enabling a relational database system’s peripherals, including storage, to be adjusted dynamically according to query workload, performance, and deadline constraints. Complex queries usually contain common subexpressions, either in a single query or among multiple queries. As it has become a problem for many companies to process such a huge amount of data using traditional computing techniques, research is being carried out to find an appropriate algorithm that yields an optimal solution as the size of the database increases. Most of the data handled today are of an unstructured type, such as the data in social sites, search engines, blogs, etc. The challenges faced with big data today are not only storing and linking the data but also retrieving, updating, and analyzing it. Cloud computing can expand and shrink as per the need for storage, and mainly provides resources as and when needed. As big data is also a kind of resource, it too is available through cloud computing. Different advanced applications usually operate on big data that may reside in various stores. Cloud computing and cloud storage may also have performance advantages due to parallel computing, virtualization technology, and data access via the web interface. Therefore, the actual task may require migrating existing systems and databases to the cloud. In this paper, the firefly optimization technique is applied to evaluate performance while processing queries on big data.
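A minimal sketch of the firefly technique named above, applied to a toy cost surface rather than an actual big-data query workload; all parameter values and names are illustrative assumptions, not the paper's.

```python
import numpy as np

def firefly_minimize(cost, dim, n_fireflies=15, n_iter=60,
                     alpha=0.2, beta0=1.0, gamma=0.01, seed=0):
    """Minimal firefly algorithm: dimmer fireflies move toward brighter
    (lower-cost) ones, with attractiveness decaying over squared distance."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5, 5, (n_fireflies, dim))
    light = np.array([cost(p) for p in pos])
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if light[j] < light[i]:  # j is brighter: move i toward j
                    r2 = np.sum((pos[i] - pos[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    pos[i] += beta * (pos[j] - pos[i]) + alpha * rng.uniform(-0.5, 0.5, dim)
                    light[i] = cost(pos[i])
    best = int(np.argmin(light))
    return pos[best], light[best]

# Toy stand-in for a query-processing cost surface; the paper's real objective
# would be measured on the big-data workload itself.
sphere = lambda p: float(np.sum(p ** 2))
best_pos, best_cost = firefly_minimize(sphere, dim=2)
print(best_pos, best_cost)
```

In a query-processing setting the cost function would instead measure, say, estimated execution time of a candidate plan or resource allocation, but the attraction-and-random-walk dynamics are the same.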
Vyacheslav V. Lyashenko, Rami Matarneh, Oleg A. Kobylin
This paper focuses on the analysis and processing of blood microscopic images because of their importance for comprehensive analysis and diagnosis of human health. Due to the complexity of such work, we explored various possible methods, such as histogram equalization of brightness values (luminance), non-linear stretching of the dynamic range of brightness values, mask filtering, and fuzzy masking, to get more accurate results. A color segmentation method was used to analyze the structure of blood microscopic images. The results showed that segmentation quality increases when using the fuzzy masking method, which in turn improves the analysis of blood microscopic images.
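As one concrete example of the pre-processing methods listed above, histogram equalization of brightness values can be sketched as follows (the synthetic patch stands in for a real microscopy image; this is the textbook CDF remapping, not the authors' specific pipeline):

```python
import numpy as np

def equalize_histogram(img):
    """Classical histogram equalization of an 8-bit grayscale image:
    map each gray level through the normalized cumulative histogram."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]  # first non-empty bin
    # Standard CDF-based remapping that spreads brightness over [0, 255].
    lut = np.clip(np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255),
                  0, 255).astype(np.uint8)
    return lut[img]

# A synthetic low-contrast patch: values squeezed into the range [100, 120].
rng = np.random.default_rng(1)
patch = rng.integers(100, 121, size=(64, 64)).astype(np.uint8)
out = equalize_histogram(patch)
print(patch.min(), patch.max(), "->", out.min(), out.max())
```

The equalized output uses the full 0–255 range, which is exactly the contrast gain that makes cell boundaries easier to segment in low-contrast microscopy images.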
Arwa Shaker Bokhari, Bayan Hashr, Eman Alahmadi, Ph.D. Omar A. Batarfi
The proliferation of mobile devices is rapidly increasing, which makes mobile devices one of the most usable platforms. In addition, cloud computing is the new generation of enterprise information systems, and most mobile applications run in a cloud environment, which is vulnerable to intrusions and attacks. Furthermore, intrusion detection on mobile devices is challenging due to their limited capabilities. Therefore, there is a need for techniques that detect attacks in an efficient way by verifying the user’s attributes and access policy; one of these techniques is an Intrusion Detection System (IDS). This paper introduces a Mobile Cloud Intrusion Detection System (MCIDS). The proposed model detects intrusions using an IP address feature provided by the mobile device agent. When an intrusion is detected, the system informs the cloud administrator to take proper action. Finally, we found that adding the IP feature to the IDS increased the accuracy of intrusion detection.
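A toy sketch of IP-feature-based detection in the spirit of the MCIDS description, assuming a static blacklist; the networks, function names, and alert format here are illustrative, not the paper's.

```python
import ipaddress

# Illustrative blacklist; a deployed system would maintain this from threat feeds.
BLACKLISTED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),   # TEST-NET-3, used here as "known bad"
    ipaddress.ip_network("198.51.100.0/24"),  # TEST-NET-2
]

def check_request(source_ip):
    """Return an alert dict for the cloud administrator if the reported
    source IP falls inside a blacklisted network, else None."""
    addr = ipaddress.ip_address(source_ip)
    for net in BLACKLISTED_NETWORKS:
        if addr in net:
            return {"action": "alert_admin", "ip": source_ip, "network": str(net)}
    return None

alert = check_request("203.0.113.77")   # inside a blacklisted range
benign = check_request("192.0.2.10")    # allowed under this toy policy
print(alert, benign)
```

Matching against network ranges rather than exact addresses keeps the check cheap, which matters on resource-limited mobile devices as the abstract notes.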
Huey-Hong Hsieh, Chao-Tsung Hsiao, Ming-Chang Lee
This study develops a fuzzy multi-objective non-linear programming model for solving regional water resource allocation planning problems in an ecological city. The proposed model attempts to allocate resources efficiently under the objectives of maximizing each industry's production capital and labor while minimizing the pollution emissions from the various industries. The restrictions cover land and water resources, the capability of industrial technology, and capital and labor constraints. The proposed model yields overall levels of satisfaction for industrial output, total COD emissions, total SO2 emissions, the amount of investment in each industry's productive capital, and the amount of productive labor. In the illustration presented in this paper, capacity, COD emissions, and SO2 emissions are linear fuzzy functions. The model is therefore applicable to regional water pollution treatment planning.
Sattar J Aboud
Identity-based encryption has received great attention in the last decade, and most identity-based schemes are built from bilinear pairings. Thus, an identity-based scheme without pairings is of great interest in the area of public key encryption. So far, it remains a challenge to build an identity-based scheme from quadratic residues. Therefore, in this paper, we introduce a new public key encryption scheme in which the public key of the signer can be chosen as a known integer, for example the identity. We analyze the security of the proposed scheme and show that it is related to a quadratic residuosity assumption. We also show that the proposed scheme is chosen-message and chosen-identity secure in the random oracle model, assuming the intractability of the discrete logarithm problem.
Himadri Bhattacharjee and Dr. Samir Kumar Bandyopadhyay
Image steganography is the art of hiding a message, image, or file within another message, image, or file. An old term from ancient Greek, steganography is derived from steganos, meaning “concealed,” and graphein, meaning “writing”; in other words, it refers to the science of “invisible” communication. Unlike cryptography, where the goal is to secure communications from an eavesdropper, steganography techniques strive to hide the very presence of the message itself from an observer. In this research paper, a novel data-hiding technique based on the Lucas-system representation of digital images is presented. A modified classical Least Significant Bit (LSB) embedding method is performed in which three message bits are embedded within a cover image pixel using the Lucas number representation of that pixel, altering only one least significant bit plane of the pixel. The Lucas representation of grey-level images requires 12 bit planes instead of the usual 8 bit planes of binary representation. The central idea of the proposed method is to increase embedding capacity and security, so an Arnold Transform based image scrambling technique is used to select a random pixel ordering of the cover image for efficient data hiding. The main objective of the paper is to combine both of these preferences with resistance to visual and statistical attacks for a large amount of data hidden in a cover image. Experimental results show that the proposed method has a larger capacity for embedding data and a high Peak Signal to Noise Ratio (PSNR), and that the proposed algorithm is highly secure with good perceptual invisibility even when planes other than the least significant bit plane are selected for embedding.
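The paper's 3-bit Lucas/Arnold scheme is more elaborate than can be reproduced here, but the classical one-bit LSB baseline it modifies can be sketched as follows (illustrative code, not the authors' method):

```python
import numpy as np

def lsb_embed(cover, bits):
    """Embed a bit string into the least significant bit plane of a
    grayscale cover image, one bit per pixel in raster order."""
    flat = cover.ravel().copy()
    for i, b in enumerate(bits):
        flat[i] = (flat[i] & 0xFE) | int(b)  # clear the LSB, then set the message bit
    return flat.reshape(cover.shape)

def lsb_extract(stego, n_bits):
    """Read the first n_bits least significant bits back out."""
    return "".join(str(p & 1) for p in stego.ravel()[:n_bits])

rng = np.random.default_rng(7)
cover = rng.integers(0, 256, size=(8, 8)).astype(np.uint8)
message = "1011001110001111"
stego = lsb_embed(cover, message)
recovered = lsb_extract(stego, len(message))
# Each pixel changes by at most 1 gray level, so distortion is imperceptible.
max_change = int(np.max(np.abs(stego.astype(int) - cover.astype(int))))
print(recovered, max_change)
```

The Lucas-representation variant described in the abstract raises capacity to three bits per pixel while still flipping only one bit plane, and the Arnold-Transform scrambling replaces the raster ordering used here with a pseudo-random pixel order.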
R. K. Srivastava and Atib Khan
Various methods based on soft computing techniques have been developed for forecasting by establishing relations on time series data. The present study proposes an improved and robust forecasting method based on soft computing techniques. The model has been developed as a simple computational algorithm and is implemented on time series data. The study uses the fuzzy set theory of Zadeh [1] and the fuzzy time series models introduced by Song and Chissom [2] and Chen [3]. Forecasts have also been obtained by developing an artificial neural network model based on the back propagation algorithm. The study provides forecasts for a lead year using a fuzzy time series model and the back propagation algorithm. The forecast values obtained through these two soft computing techniques have been compared, and their performance and suitability have been examined. Their robustness has also been compared.
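A minimal sketch of a first-order fuzzy time series forecast in the style of Chen [3], one of the models the study builds on; the interval count and the toy data are illustrative assumptions, not the study's.

```python
import numpy as np

def chen_forecast(series, n_intervals=7):
    """First-order fuzzy time series forecast: partition the universe of
    discourse into equal intervals, fuzzify each observation to its interval,
    group the fuzzy logical relationships A_i -> A_j, and forecast the next
    value as the mean midpoint of the intervals that historically followed
    the current one."""
    lo, hi = min(series) - 1, max(series) + 1
    edges = np.linspace(lo, hi, n_intervals + 1)
    mids = (edges[:-1] + edges[1:]) / 2
    labels = np.clip(np.searchsorted(edges, series, side="right") - 1,
                     0, n_intervals - 1)
    # Fuzzy logical relationship groups: state -> set of successor states.
    groups = {}
    for a, b in zip(labels[:-1], labels[1:]):
        groups.setdefault(int(a), set()).add(int(b))
    current = int(labels[-1])
    successors = groups.get(current, {current})
    return float(np.mean([mids[s] for s in successors]))

# Toy enrollment-like series (illustrative numbers, not the study's data).
data = [13055, 13563, 13867, 14696, 15460, 15311, 15603, 15861, 16807]
forecast = chen_forecast(data)
print(round(forecast))
```

A back-propagation neural network trained on lagged values would provide the comparison forecast described in the abstract; the fuzzy model's appeal is that its relationship groups stay human-readable.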