Version-1 (May-June 2015)
Paper Type | : | Research Paper |
Title | : | A Survey on Authorization Systems for Web Applications |
Country | : | India |
Authors | : | Mr. Midhun TP || Mr. Prasanth Kumar PV || Mr. Anoop Jose |
Abstract: Web services are the central point of usage in modern web architecture. The service-oriented architecture (SOA) used in web services offers a simple platform for integrating heterogeneous distributed web applications and services. The distributed and open nature of present systems makes them vulnerable to security issues such as Web Services Description Language (WSDL) spoofing, middleware hijacking, etc. Assuring security for web services against all such flaws is difficult. Authorization is an important aspect of assuring security, and authorization failures can introduce many vulnerabilities into systems built on distributed web services. This paper presents a survey of authorization techniques for web-service-based applications.
Keywords: web services, authorization, access control in web services, attacks on web services
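As a rough illustration of the attribute-based access control (ABAC) model surveyed here (see [2]), the sketch below shows how a policy decision point might compare subject, resource and action attributes against policy rules. The rules, attribute names and values are hypothetical examples, not the paper's implementation.

```python
# Minimal attribute-based access control (ABAC) check; all rules and
# attribute names below are hypothetical illustrations.
from typing import Dict, List

Policy = Dict[str, Dict[str, object]]

POLICIES: List[Policy] = [
    # Rule: a "doctor" from "cardiology" may invoke the "readRecord"
    # operation of the patient-records web service.
    {
        "subject":  {"role": "doctor", "department": "cardiology"},
        "resource": {"service": "patient-records"},
        "action":   {"operation": "readRecord"},
    },
]

def matches(required: Dict[str, object], supplied: Dict[str, object]) -> bool:
    """Every attribute demanded by the rule must be present with the same value."""
    return all(supplied.get(k) == v for k, v in required.items())

def is_authorized(subject, resource, action) -> bool:
    """Grant access if any policy rule matches all three attribute sets."""
    return any(
        matches(rule["subject"], subject)
        and matches(rule["resource"], resource)
        and matches(rule["action"], action)
        for rule in POLICIES
    )

if __name__ == "__main__":
    print(is_authorized(
        {"role": "doctor", "department": "cardiology"},
        {"service": "patient-records"},
        {"operation": "readRecord"},
    ))  # True
```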
[1]. Vorobiev, A. and Han, J. (2006) 'Security attack ontology for web services', Proceedings of the Second International Conference on Semantics, Knowledge, and Grid (SKG'06), 2006, pp. 42.
[2]. E. Yuan and J. Tong. Attribute based access control (ABAC): a new access control approach for service oriented architectures. Ottawa New Challenges for Access Control Workshop, April 2005.
[3]. Thomas Erl, SOA: Principles of Service Design, New York: Macmillan, 2008.
[4]. Martin Gudgin, Marc Hadley, Noah Mendelsohn, Jean-Jacques Moreau, and Henrik Frystyk Nielsen. SOAP Version 1.2 Part 1: Messaging Framework. W3C Recommendation, 2003.
[5]. M. Naedele, "Standards for XML and Web services security," IEEE Computer, vol. 36, no. 4, pp. 96–98, Apr. 2003.
[6]. B. Thuraisingham, "Security standards for the semantic web," Comput. Stand. Interfaces, vol. 27, no. 3, pp. 257–268, Mar. 2005.
[7]. Schahram Dustdar, Wolfgang Schreiner, A survey on web services composition, International Journal of Web and Grid Services, Volume 1, Issue 1, Inderscience, 2005.
Paper Type | : | Research Paper |
Title | : | A New Skin Color Based Face Detection Algorithm by Combining Three Color Model Algorithms |
Country | : | Iraq |
Authors | : | Hewa Majeed Zangana |
Abstract: Human face recognition systems have gained considerable attention during the last decade due to their vast applications in computing and their advantages over earlier biometric methods. There are many applications with respect to security, sensitivity and secrecy. Face detection is the first and most important step of a recognition system. Human face detection suffers from various challenges due to variation in image conditions, size, resolution, pose and rotation, so accurate and robust detection has been a major task for researchers. Numerous methods and techniques for face detection exist, but none can guarantee success in all conditions for all kinds of faces and images; some methods give good results in certain conditions, while others work well with different kinds of images. Face detection based on skin color is found to be a more effective technique because skin color is distinctive and can easily be separated from other objects in the image and from the background.
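To make the combined colour-model idea concrete, the sketch below labels a pixel as skin only if RGB, YCbCr and HSV rules all agree. The threshold values are illustrative figures commonly quoted in the skin-detection literature, not necessarily the exact ones used in this paper.

```python
# Per-pixel skin test that ANDs three colour-model rules (RGB, YCbCr, HSV).
# Thresholds are illustrative values from the skin-detection literature.
import colorsys

def rgb_rule(r, g, b):
    return (r > 95 and g > 40 and b > 20 and
            max(r, g, b) - min(r, g, b) > 15 and
            abs(r - g) > 15 and r > g and r > b)

def ycbcr_rule(r, g, b):
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return 77 <= cb <= 127 and 133 <= cr <= 173

def hsv_rule(r, g, b):
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h * 360 <= 50 and s >= 0.23

def is_skin(r, g, b):
    """A pixel is labelled skin only if all three colour models agree."""
    return rgb_rule(r, g, b) and ycbcr_rule(r, g, b) and hsv_rule(r, g, b)

print(is_skin(220, 170, 140))  # typical skin tone -> True
print(is_skin(30, 90, 200))    # blue background   -> False
```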
[1]. Hong, S. et al., "Facial feature detection using geometrical face model: An efficient approach," Journal of Pattern Recognition, Vol. 31, No. 3, pp. 273-282, 1998.
[2]. Leung, C., "Real Time Face Recognition," B.Sc. Project, School of Information Technology and Electrical Engineering, University of Queensland, 2001.
[3]. Sanjay Kr. Singh, D. S. Chauhan, Mayank Vatsa and Richa Singh, "A Robust Skin Color Based Face Detection Algorithm," Tamkang Journal of Science and Engineering, vol. 6 (4), pp. 227-234, 2003.
[4]. Muhammad Tariq Mahmood, "Face Detection by Image Discriminating," 2006. [Online]. Available: http://www.bth.se/fou/cuppsats.nsf/all/6c509ae86a297ca4c12571d300512cac/$file/DVD009-MasterThesisReport.pdf
Paper Type | : | Research Paper |
Title | : | Load Rebalancing for Distributed Hash Tables in Cloud Computing |
Country | : | Iraq |
Authors | : | Ahmed Hassan A/Elmutaal || Dr. Amin Babiker A/Nabi Mustafa |
Abstract: In cloud computing, applications are provided and managed by the cloud server, and data is also stored remotely in the cloud configuration. As cloud computing grows rapidly and clients demand more services and better results, load balancing for the cloud has become a very interesting and important research area. Load balancing ensures that every processor in the system, or every node in the network, does approximately the same amount of work at any instant of time. In this paper, a fully distributed load-rebalancing algorithm is presented to cope with the load-balance problem. Our algorithm is compared against a centralized approach in a production system and against a competing distributed solution presented in the literature.
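The toy sketch below illustrates the general rebalancing idea: each node compares its load with the average and heavy nodes shed file chunks to light nodes until every node sits near the mean. The node names and tolerance band are hypothetical and this is not the paper's distributed protocol, only the underlying intuition.

```python
# Toy load-rebalancing illustration: heavy nodes migrate chunks to light
# nodes until loads converge toward the average. Names/threshold invented.
AVG_TOLERANCE = 0.1  # a node may deviate +/-10% from the average load

def rebalance(loads):
    """loads: dict node -> number of stored chunks; returns migration list."""
    avg = sum(loads.values()) / len(loads)
    heavy = [n for n, l in loads.items() if l > avg * (1 + AVG_TOLERANCE)]
    light = [n for n, l in loads.items() if l < avg * (1 - AVG_TOLERANCE)]
    migrations = []
    for h in heavy:
        for l in light:
            # move chunks one by one from the heavy node to the light node
            while loads[h] > avg and loads[l] < avg:
                loads[h] -= 1
                loads[l] += 1
                migrations.append((h, l))
    return migrations

loads = {"node-A": 90, "node-B": 10, "node-C": 20}
moves = rebalance(loads)
print(loads)        # roughly equalised around the average (40)
print(len(moves))   # number of chunk migrations performed
```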
[1]. Xu, Gaochao, Junjie Pang, and Xiaodong Fu. "A load balancing model based on cloud partitioning for the public cloud." IEEE Tsinghua Science and Technology, Vol. 18, no. 1, pp. 34-39, 2013.
[2]. P. Mell and T. Grance, "The NIST Definition of Cloud Computing", available online at: http://csrc.nist.gov/publications/nistpubs/800-145/SP800-145.pdf, 2012.
[3]. N. G. Shivaratri, P. Krueger, and M. Singhal, "Load distributing for locally distributed systems", Journal Computer, vol. 25, no. 12, pp. 33-44, Dec. 1992.
[4]. Zhu, Yan, Huaixi Wang, Zexing Hu, Gail-Joon Ahn, Hongxin Hu, and Stephen S. Yau, "Efficient provable data possession for hybrid clouds", in Proceedings of the 17th ACM Conference on Computer and Communications Security, pp. 756-758, ACM, 2010.
[5]. A. Rouse, "Public cloud", available at: http://searchcloudcomputing.techtarget.com/definition/public-cloud.
Paper Type | : | Research Paper |
Title | : | Model Based Software Timing Analysis Using Sequence Diagram for Commercial Applications |
Country | : | India |
Authors | : | Hrishikesh Mukherjee || Shrimann Upadhyay || Arup Abhinaa Achariya |
Abstract: Verifying the running time of a program is necessary when designing a system with real-life constraints. Verification defines lower and upper bounds that reflect control flow that depends on data as well as on instruction execution times. Previously, these bounds were very wide due to the lack of efficient control-flow analysis and architectural modelling techniques, but significant progress has since been made in both areas, so the cost of formal software timing analysis has become more practical. Much earlier work addressed code-based complexity analysis for embedded software; this type of analysis is known as worst-case execution time (WCET) analysis. In our research we apply the concept of timing analysis to UML model-based timing analysis, which minimises the effort of deriving timing from code snippets and also helps the developer identify timing requirements from the requirements stage onward.
Keywords: Verification, Worst-Case Execution Time, UML Models
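A minimal sketch of the model-based idea: each message of a UML sequence diagram is annotated with a WCET estimate, and the diagram is folded into a path bound by taking the worst branch of each alt/opt fragment. The message names and timing figures below are hypothetical, not from the paper.

```python
# Fold a sequence diagram annotated with per-message WCET estimates into a
# worst-case bound. Message names and timings are invented for illustration.

# A plain interaction is a list of (message, wcet_ms) steps; an "alt"
# fragment is a dict whose branches are themselves interactions.
diagram = [
    ("login()",    4.0),
    ("validate()", 7.5),
    {"alt": [
        [("loadDashboard()", 12.0)],            # branch 1
        [("showError()", 2.0), ("log()", 1.5)], # branch 2
    ]},
    ("render()",   6.0),
]

def worst_case(interaction):
    """Worst-case time of an interaction, taking the max over alt branches."""
    total = 0.0
    for step in interaction:
        if isinstance(step, dict):                           # alt/opt fragment
            total += max(worst_case(branch) for branch in step["alt"])
        else:
            total += step[1]                                  # simple message
    return total

print(worst_case(diagram))  # 4.0 + 7.5 + 12.0 + 6.0 = 29.5 ms
```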
[1]. Y .Li and S. Malik, Performance Analysis of Real-Time Embedded Software, Norwell, MA: Kluwer, 1999.
[2]. Object Management Group: UML 2.0 specifications, OMG adopted specification, 2003, http://www.omg.org.
[3]. A. V. Aho, R. Sethi, J. D. Ullman, Compilers: Principles, Techniques & tools, Reading, MA: Addison-Wesley, 1992.
[4]. Hartmann, M. Vieira, H. Foster and A. Ruder, A UML-based Approach to System Testing, Journal of Innovation System Software Engineering, Vol. 1, pp. 12-14, 2005.
[5]. D Jeya Mala, S Geetha, Object Oriented Analysis and Design Using UML, McGraw Hill Education.
Paper Type | : | Research Paper |
Title | : | Reversible Watermarking based on Histogram Shifting Modification: A Review |
Country | : | India |
Authors | : | Anamika Jain || Namita Tiwari |
Abstract: When we talk about data communication, or the exchange of information from sender to receiver, the major concern is the DATA. Being of high importance, it is prone to various kinds of attacks, and data-hiding techniques are used to protect it. Available data-hiding techniques include digital watermarking, steganography, etc. With digital watermarking, data is hidden in any type of multimedia, such as images or audio. Watermarking is one of the major solutions for providing authenticity and copyright protection to digital data. However, watermarking damages the cover signal (the signal in which data are embedded), so recovery of the original cover signal is not possible at the receiving end, which is not tolerable in fields such as medicine and the military.
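The sketch below shows the core histogram-shifting embedding step in the style of Ni et al. [3]: grey levels between the peak bin and an empty bin are shifted by one to open a gap next to the peak, and payload bits are written into peak pixels. It operates on a flat list of 8-bit values and, for brevity, only handles the case where the empty bin lies above the peak; it is illustrative, not the reviewed schemes themselves.

```python
# Histogram-shifting embedding sketch (Ni et al. style) on a flat list of
# 8-bit grey values; handles only the empty-bin-above-peak case for brevity.
from collections import Counter

def embed(pixels, bits):
    hist = Counter(pixels)
    peak = max(hist, key=hist.get)                       # most frequent grey level
    zero = min(range(peak + 1, 256),
               key=lambda v: hist.get(v, 0))             # emptiest level above the peak
    out, it = [], iter(bits)
    for p in pixels:
        if peak < p < zero:
            out.append(p + 1)                            # shift right to open a gap at peak+1
        elif p == peak:
            out.append(p + next(it, 0))                  # peak -> bit 0, peak+1 -> bit 1
        else:
            out.append(p)
    return out, peak, zero

pixels = [100, 100, 101, 100, 102, 150, 100, 101]
marked, peak, zero = embed(pixels, [1, 0, 1, 1])
print(peak, zero, marked)
```

Extraction reverses the process: marked pixels equal to the peak decode as 0, peak+1 decodes as 1, and all values up to the empty bin are shifted back by one, recovering the original image exactly.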
[1]. C. De Vleeschouwer, J. E. Delaigle, and B. Macq, "Circular interpretation of histogram for reversible watermarking," 2001 IEEE Fourth Work. Multimed. Signal Process. (Cat. No.01TH8564), pp. 345–350, 2001.
[2]. C. De Vleeschouwer, J. F. Delaigle, and B. Macq, "Circular interpretation of bijective transformations in lossless watermarking for media asset management," IEEE Trans. Multimed., vol. 5, no. 1, pp. 97–105, 2003.
[3]. Z. N. Z. Ni, Y.-Q. S. Y.-Q. Shi, N. Ansari, and W. S. W. Su, "Reversible data hiding," IEEE Trans. Circuits Syst. Video Technol., vol. 16, no. 3, pp. 354–362, 2006.
[4]. Chia-Chen Lin, Wei-Liang Tai and Chin-Chen Chang, Pattern Recognition, vol. 41, pp. 3582–3591, 2008.
[5]. P. Tsai, Y. C. Hu, and H. L. Yeh, "Reversible image hiding scheme using predictive coding and histogram shifting," Signal Processing, vol. 89, pp. 1129–1143, 2009.
Paper Type | : | Research Paper |
Title | : | Enhancing the Usability of Library System at CSIBER using QR Code |
Country | : | India |
Authors | : | Dr. P.G. Naik || Dr. R.S. Kamath || Mrs. S.S. Jamsandekar || Mrs. K.S. Mahajan || Mr. M.B. Patil |
Abstract: With the advancement of information technology, information is no longer confined to a single physical location, and with the parallel advancement of mobile technology it is available anywhere, at any time, with a single click. The quick response (QR) code has made it even easier to access information from smartphones with a QR scanner installed, without memorizing complex web addresses. The intent of this research is to give an end user quick access to CSIBER library resources by revealing the current location of a resource in the library. To enable this, an open-source tool has been designed and developed which queries the end user for book information and instantly reveals the book's physical location in the library.
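A minimal sketch of the lookup-and-encode idea, assuming the third-party "qrcode" package (pip install qrcode[pil]): a book's shelf location is fetched from a catalogue record and rendered as a QR symbol that patrons can scan. The catalogue dictionary and field names are hypothetical stand-ins for the CSIBER library database, not its actual schema.

```python
# Encode a book's shelf location as a QR image using the "qrcode" package.
# CATALOGUE and its fields are hypothetical placeholders.
import qrcode

CATALOGUE = {
    "ACC-10234": {"title": "Software Engineering", "rack": "R-12", "shelf": "S-3"},
}

def location_qr(accession_no, out_file):
    rec = CATALOGUE[accession_no]
    payload = f"{rec['title']} | Rack {rec['rack']} | Shelf {rec['shelf']}"
    img = qrcode.make(payload)        # build the QR symbol for the location string
    img.save(out_file)                # save as a PNG the catalogue page can display
    return payload

print(location_qr("ACC-10234", "book_location.png"))
```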
[1]. Jones Steve, The Internet Goes to College: How students are living in the future with today's technology, Pew Internet and American Life Project, Washington, 2002. Available: http://www.pewinternet.org/reports/pdfs/PIP_College_Report.pdf (accessed 17 April 2015).
[2]. Aldrich, Alan W. (2010). Universities and Libraries Move to the Mobile Web.
[3]. Lombardi, John V., Academic Libraries in a Digital Age, D-Lib Magazine, 6(10), 2000.
[4]. Ching-yin Law, Simon So, QR Codes in Education, Journal of Educational Technology Development and Exchange, 3(1), 2010, 85-100.
Paper Type | : | Research Paper |
Title | : | Improved Max-Min Scheduling Algorithm |
Country | : | India |
Authors | : | Navdeep Kaur || Khushdeep Kaur |
Abstract: In this research paper, additional constraints have been considered to develop a holistic analysis-based algorithm built on the Max-Min algorithm, which works on the principle of sorting jobs (cloudlets) by their completion time. The improved algorithm also reviews job characteristics in terms of size, pattern, payload ratio and the storage blocks available in a particular cluster of the contributing file systems. The observations show no significant overhead due to the addition of these constraints, as the sorting operation remains the same and efficient, while storage-aware allocation helps obtain better performance.
Keywords: Cloud based computing, Max-Min algorithm.
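For reference, the sketch below implements the baseline Max-Min step that the improved algorithm builds on: each round picks the unscheduled cloudlet whose best (minimum) completion time is largest and binds it to the VM giving that completion time. Cloudlet lengths and VM speeds are made-up numbers; the paper's additional constraints (size, pattern, payload ratio, storage blocks) are not modelled here.

```python
# Baseline Max-Min scheduling: repeatedly assign the cloudlet with the
# largest minimum completion time to its best VM. Inputs are invented.
cloudlet_len = [400, 100, 300, 50]        # million instructions
vm_mips      = [100, 50]                  # VM processing speeds

def max_min(lengths, speeds):
    ready = [0.0] * len(speeds)           # when each VM becomes free
    pending = set(range(len(lengths)))
    schedule = []
    while pending:
        best = None                        # (completion_time, cloudlet, vm)
        for c in pending:
            # minimum completion time of cloudlet c over all VMs
            ct, vm = min((ready[v] + lengths[c] / speeds[v], v)
                         for v in range(len(speeds)))
            if best is None or ct > best[0]:   # Max of the Min completion times
                best = (ct, c, vm)
        ct, c, vm = best
        ready[vm] = ct
        pending.remove(c)
        schedule.append((c, vm, ct))
    return schedule, max(ready)

plan, makespan = max_min(cloudlet_len, vm_mips)
print(plan)
print("makespan:", makespan)
```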
[1]. Er. Shimpy and Mr. Jagandeep Sidhu, "Different scheduling algorithm in different Cloud computing", International Journal of Advanced Research in Computer and Communication Engineering, Issue 9, September 2014.
[2]. Yogita Chawla and Mansi Bhonsle, "A Study on Scheduling Methods in Cloud Computing", International Journal of Emerging Trends & Technology in Computer Science (IJETTCS), September-October 2012.
[3]. Pinal Salot, "A Survey of various scheduling algorithm in cloud computing environment", International Journal of Research in Engineering and Technology, Feb. 2013.
[4]. Yogita Chawla and Mansi Bhonsle, "A study on scheduling methods in cloud computing", International Journal of Emerging Trends & Technology in Computer Science (IJETTCS), September-October 2012.
[5]. Shah Mihir and Asst. Prof. Yask Patel, "A Survey of Task Scheduling Algorithm in Cloud Computing", International Journal of Application or Innovation in Engineering & Management (IJAIEM), January 2015.
Paper Type | : | Research Paper |
Title | : | Application of Support Vector Machine and Fuzzy Logic for Detecting and Identifying Liver Disorder in Patients |
Country | : | Nigeria |
Authors | : | Ejiofor C.I || Ugwu C. |
Abstract: The liver is the largest organ in the body and plays a major role in the digestion and processing of food. Because of this role, it has a high chance of coming into contact with harmful products that enter the body. The system described here first screens patients and then identifies the particular liver disorder, if any, from which a patient suffers. The diagnosis of liver disorders has been subjective at best. This research paper proposes a hybrid system that uses fuzzy logic for approximation and for handling noisy or incomplete data, while classification is handled using a Support Vector Machine (SVM).
Keywords: Liver Disorder, Fuzzy logic, SVM, Dataset
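A rough sketch of such a hybrid pipeline, assuming scikit-learn: a simple fuzzy membership function smooths each raw clinical reading into a degree in [0, 1], and an SVM classifies the fuzzified feature vector. The feature ranges and tiny toy dataset are invented for illustration only; a real system would use an actual liver-function dataset and clinically validated ranges.

```python
# Fuzzify clinical readings, then classify with an SVM (scikit-learn).
# Ranges and toy data are illustrative, not clinical guidance.
from sklearn.svm import SVC

def fuzzy_high(value, lo, hi):
    """Degree to which `value` is 'high' on a ramp between lo and hi."""
    if value <= lo:
        return 0.0
    if value >= hi:
        return 1.0
    return (value - lo) / (hi - lo)

def fuzzify(bilirubin, alt_enzyme):
    # end points of the ramps are made-up illustrative values
    return [fuzzy_high(bilirubin, 1.2, 3.0), fuzzy_high(alt_enzyme, 40, 200)]

# toy training set: (bilirubin mg/dL, ALT U/L) -> 1 = liver disorder, 0 = healthy
X = [fuzzify(0.8, 25), fuzzify(1.0, 35), fuzzify(2.8, 180), fuzzify(3.5, 260)]
y = [0, 0, 1, 1]

model = SVC(kernel="rbf", gamma="scale").fit(X, y)
print(model.predict([fuzzify(2.5, 150)]))   # likely flagged as a disorder
```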
[1]. Ahmad H. (2011), "Fuzzy approach to Likert Spectrum in Classified levels in surveying researches", retrieved from http://www.tjmcs.com.
[2]. Angel C. and Rocio R. (2011), "Documentation management with Ant Colony Optimization Metaheuristic: A Fuzzy Text Clustering Approach Using Pheromone Trails", in Soft Computing in Industrial Applications, Advances in Intelligent and Soft Computing, vol. 96, 2011, 261-270, DOI: 10.1007/978-3-642-20505-1_23.
[3]. Christos S. and Dimitros S. (2008) "Neural Network", retrieved from http://www.docstoc.com/docs/15050/neural-networks
[4]. Healthline, (2014), Liver Disorder and treatment, retrieved online from http://www.healthline.com
Paper Type | : | Research Paper |
Title | : | Image Based Relational Database Watermarking: A Survey |
Country | : | India |
Authors | : | Sapna Prajapati || Namita Tiwari |
Abstract: In the past few years, relational database watermarking has emerged as a major research topic because of the increasing use of relational databases. The basic need for relational database watermarking is to prevent illegal access to data by providing copyright protection, tamper detection and integrity maintenance. To serve this purpose, many database watermarking techniques have been proposed, with different algorithms and cover types. With the use of watermarking, unauthorized duplication and distribution can be detected. A watermarking scheme should be able to meet some important challenges: 1).
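For orientation, the sketch below shows the key-based bit-embedding idea behind Agrawal and Kiernan's scheme [1]: a keyed hash of each tuple's primary key decides whether the tuple carries a mark and which low-order bit of a numeric attribute is set. The table contents, secret key and parameters are illustrative only, and this is the classical numeric scheme rather than the image-based variants the survey covers.

```python
# Key-based tuple marking in the spirit of Agrawal & Kiernan [1].
# Table, secret key and parameters are invented for illustration.
import hashlib

SECRET_KEY = b"owner-secret"
GAMMA, XI = 3, 2          # mark roughly 1 in GAMMA tuples, among XI low bits

def keyed_hash(*parts):
    h = hashlib.sha256(SECRET_KEY)
    for p in parts:
        h.update(str(p).encode())
    return int.from_bytes(h.digest()[:8], "big")

def watermark(rows):
    """rows: list of (primary_key, numeric_value); returns a marked copy."""
    marked = []
    for pk, value in rows:
        if keyed_hash(pk) % GAMMA == 0:              # this tuple carries a mark
            bit_pos = keyed_hash(pk, "pos") % XI     # which low-order bit to set
            bit_val = keyed_hash(pk, "val") % 2      # value of the embedded bit
            value = (value & ~(1 << bit_pos)) | (bit_val << bit_pos)
        marked.append((pk, value))
    return marked

rows = [(101, 5230), (102, 7481), (103, 6644), (104, 9020)]
print(watermark(rows))
```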
[1]. Agrawal, R., & Kiernan, J., 2002. Watermarking relational databases. In Proceedings of the 28th very large data bases VLDB conference, Hong Kong, China (Vol. 28, pp. 155–166).
[2]. Dwivedi, A. K., Sharma, B. K., & Vyas, A. K. (2014). Watermarking Techniques for Ownership Protection of Relational Databases, 4(1), 368–375.
Paper Type | : | Research Paper |
Title | : | Analyzing and Surveying Trust In Cloud Computing Environment |
Country | : | India |
Authors | : | Kavita Rathi || Sudesh Kumari |
Abstract: Cloud computing is the most discussed research area nowadays; it provides elasticity and flexibility in using computing resources and services to fulfil the requirements of current businesses. Besides the many advantages it offers, cloud computing faces several obstacles to its growth, namely security issues, data privacy issues and distrust of cloud service providers (CSPs). Trust is found to be an essential element for achieving security and confidence in the use of distributed computing. Various issues such as data control, ownership, data integrity and security can be considered important parameters of trust. This paper addresses the existing trust models for trust establishment in cloud services and also tries to identify the shortcomings of these models.
Keywords: Trust, Issues of Trust, Trust Models, User Based Trust Models
[1]. Ankush Dhiman, Mauli Joshi, "Analysis of Performance for Data Center under for Private Cloud through Cloud Computing", International Journal of Engineering and Computer Science, ISSN: 2319-7242, vol. 3, Issue 6, pp. 6422-6431, June 2014.
[2]. Hong Cai, Ning Wang, Ming Jun Zhou, "A Transparent Approach of Enabling SaaS Multi-tenancy in the Cloud", IEEE 6th World Congress on Services, 2010.
[3]. Wayne A. Jansen, NIST, "Cloud Hooks: Security and Privacy Issues in Cloud Computing", Proceedings of the 44th Hawaii International Conference on System Sciences, 2011.
[4]. Michael Armbrust, Armando Fox, et al., "A view of Cloud Computing", Communications of the ACM, vol. 53, April 2010.
[5]. Kai Hwang, Deyi Li, "Trusted Cloud Computing with Secure Resources and Data Coloring", IEEE Internet Computing, 2010.
Paper Type | : | Research Paper |
Title | : | Comparative Study on Cache Coherence Protocols |
Country | : | India |
Authors | : | Kaushik Roy || Pavan Kumar S.R. || Meenatchi S. |
Abstract: In this new age of technology, not only software but also computer architecture has evolved to support it. The main motive of this day-by-day evolution of architecture is to make systems faster, and one of the major steps in this journey is the multi-core processor architecture. In a multi-core processor system, each core has its own cache module while all cores share the same memory unit. As a result, a block in one cache becomes invalid when the same block is updated in any other cache; this is called the cache coherence problem. A lot of research has been done to overcome it, and the resulting rules and techniques are called cache coherence protocols. The main objective of this paper is to collect those research works together and present them in an accessible way, so that their techniques for overcoming the problem can be understood. A comprehensive study of these cache coherence protocols, with their pros and cons, is presented here.
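As background for the protocols compared in the paper, the sketch below encodes a snooping MSI protocol, the simplest invalidation-based scheme: each cache line is Modified, Shared or Invalid, and transitions are driven by local accesses and by requests observed on the bus. This is a didactic state table, not a cycle-accurate model of any surveyed protocol.

```python
# Didactic MSI cache-coherence state table: (current_state, event) -> next_state.
MSI = {
    ("I", "local_read"):  "S",   # read miss: fetch block, others may share it
    ("I", "local_write"): "M",   # write miss: fetch exclusively and dirty it
    ("S", "local_write"): "M",   # upgrade: other sharers get invalidated
    ("S", "bus_write"):   "I",   # another core wrote: our copy is stale
    ("M", "bus_read"):    "S",   # another core reads: write back, keep shared
    ("M", "bus_write"):   "I",   # another core writes: write back, invalidate
}

def next_state(state, event):
    return MSI.get((state, event), state)   # unlisted combinations keep the state

# A line starts Invalid, is read, then written locally, then another core writes it.
state = "I"
for ev in ["local_read", "local_write", "bus_write"]:
    state = next_state(state, ev)
    print(ev, "->", state)       # S, M, I
```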
[1]. Fredrik Dahlgren and Per Stenstrom, "Using Write Caches to Improve Performance of Cache Coherence Protocols in Shared Memory Multiprocessors", Journal of Parallel and Distributed Computing, 1995.
[2]. Anant Agarwal, Richard Simoni, John Hennessy and Mark Horowitz, "An Evaluation of Directory Schemes for Cache Coherence", IEEE, 1988.
[3]. Milo M.K. Martin, Mark D. Hill, David A. Wood, "Token Coherence: A New Framework for Shared-Memory Multiprocessors", IEEE, 2003.
[4]. Liqun Cheng, John B. Carter, Donglai Dai, "An Adaptive Cache Coherence Protocol Optimized for Producer-Consumer Sharing", 2007.
[5]. Muthukumar S. and Dhinakaran K., "Hybrid Cache Coherence Protocol for Multi-Core Processor Architecture", International Journal of Computer Applications, May 2013.
Paper Type | : | Research Paper |
Title | : | Internet Worm Classification and Detection using Data Mining Techniques |
Country | : | India |
Authors | : | Dipali Kharche || Anuradha Thakare |
Abstract: An Internet worm is a standalone malware program that replicates itself in order to spread from one computer to another. Malware includes computer viruses, worms, rootkits, key loggers, Trojan horses, dialers, adware, spyware, rogue security software and other malicious programs. It is written by attackers to interrupt computer operation, gather sensitive information or gain entry to private computer systems. Worms on the Internet need to be detected because they may create network vulnerabilities and reduce system performance. Various types of Internet worm can be detected, such as port-scan worms, UDP worms, HTTP worms, User-to-Root worms and Remote-to-Local worms.
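To illustrate the data-mining step, the sketch below trains a decision tree on a handful of KDD-style connection features to separate normal traffic from worm classes such as port-scan and UDP worms. It assumes scikit-learn is available; the feature set, the tiny training table and the class labels are invented for illustration and are not the paper's dataset.

```python
# Decision-tree classification of connection records (scikit-learn).
# Features, toy data and labels are illustrative only.
from sklearn.tree import DecisionTreeClassifier

# features: [duration_s, distinct_ports_probed, udp_ratio, failed_conn_ratio]
X = [
    [12.0,   2, 0.10, 0.01],   # normal
    [30.0,   1, 0.00, 0.00],   # normal
    [ 0.5, 120, 0.00, 0.80],   # port-scan worm
    [ 0.4, 200, 0.10, 0.90],   # port-scan worm
    [ 1.0,   3, 0.90, 0.60],   # UDP worm
    [ 0.8,   5, 0.95, 0.70],   # UDP worm
]
y = ["normal", "normal", "portscan", "portscan", "udp_worm", "udp_worm"]

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(clf.predict([[0.6, 150, 0.05, 0.85]]))   # expected: portscan
print(clf.predict([[0.9,   4, 0.92, 0.65]]))   # expected: udp_worm
```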
[1]. N. Weaver, V. Paxson, S. Staniford and R. Cunningham, "Taxonomy of computer worms," Proc. of the ACM Workshop on Rapid Malcode, WORM'03, 2003, pp. 11-18.
[2]. C. Smith, A. Matrawy, S. Chow and B. Abdelaziz, "Computer Worms: Architecture, Evasion Strategies, and Detection Mechanisms," J. of Information Assurance and Security, 2009, pp. 69-83.
[3]. M. M. Rasheed, N. M. Norwawi, O. Ghazali, M. M. Kadhum, "Intelligent Failure Connection Algorithm for Detecting Internet Worms", International Journal of Computer Science and Network Security, Vol. 9, No. 5, 2009, pp. 280-285.
Paper Type | : | Research Paper |
Title | : | A Novel Algorithm to Protect and Manage Memory Locations |
Country | : | Iraq |
Authors | : | Buthainah F. AL-Dulaimi || Sawsan H. Jaddoa |
Abstract: Most security vulnerabilities continue to be caused by memory errors, especially in long-running programs that interact with untrusted components. While comprehensive solutions have been developed to handle memory errors, these solutions suffer from one or more of the following problems: high overheads, incompatibility, and changes to the memory model. Address-space randomization is a technique that avoids these drawbacks, but existing approaches do not offer the same level of protection. To overcome these limitations, we develop a new approach in this paper that supports comprehensive randomization, whereby the absolute locations of all (code and data) objects, as well as their relative distances, are randomized. In particular, we have successfully deployed the method in the implementation of a language run-time system. Our approach is implemented as a fully automatic source-to-source transformation; the address-space randomizations take place at load time or run time, so the same copy of the binaries can be distributed to everyone, which ensures compatibility with today's software distribution model.
Keywords: Garbage collection, memory errors, memory protection, memory reallocation, transformation, program locations.
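Purely as a conceptual illustration of comprehensive randomization (the paper's transformation operates on C source and the run-time/loader, not in Python), the toy layout generator below randomizes both the base address and the relative distances between objects by shuffling allocation order and inserting random padding, so neither absolute nor relative addresses are predictable across runs. All names and parameters are hypothetical.

```python
# Toy illustration: randomize base address, object order and inter-object gaps.
import random

def randomized_layout(objects, base_range=(0x10000, 0x7fff0000), max_pad=4096):
    """objects: dict name -> size in bytes; returns name -> assigned address."""
    addr = random.randrange(*base_range) & ~0xF     # random 16-byte-aligned base
    order = list(objects)
    random.shuffle(order)                           # randomize relative order
    layout = {}
    for name in order:
        addr += random.randrange(0, max_pad)        # random gap between objects
        layout[name] = addr
        addr += objects[name]
    return layout

objs = {"stack_buf": 256, "func_table": 512, "heap_block": 1024}
print(randomized_layout(objs))   # different absolute and relative addresses every run
```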
[1]. S. Bhatkar, D. C. Du Varney, and R. Sekar, Address obfuscation: An efficient approach to combat a broad range of memory error exploits, In USENIX Security Symposium, Washington, DC, August 2003.
[2]. S. McPeak, G. C. Necula, S. P. Rahul, and W. Weimer, CIL: Intermediate language and tools for C program analysis and transformation, In Conference on Compiler Construction, 2002.
[3]. J. Baker, Antonio Cunei, Filip Pizlo, and Jan Vitek, Accurate garbage collection in uncooperative environments with lazy pointer stacks, In International Conference on Compiler Construction, 2007.
[4]. M. Hirzel, A. Diwan, and J. Henkel, On the usefulness of type and liveness accuracy for garbage collection and leak detection. ACM Transaction Program. Language System, 24(6), 2002, 593–624.
[5]. S. M. Pike, B. W. Weide, and J. E. Hollingsworth, Checkmate: cornering C++ dynamic memory errors with checked pointers. SIGCSE Technical Symposium on Computer Science Education, 2000, 352–356.
Paper Type | : | Research Paper |
Title | : | Ant Colony Optimization for Wireless Sensor Network: A Review |
Country | : | India |
Authors | : | Benu || Chakshu Goel || Sheenam |
Abstract: A wireless sensor network is a collection of specialized transducers with a communications infrastructure for monitoring and recording conditions at diverse locations. The ant colony optimization (ACO) algorithm is a probabilistic technique for solving computational problems that can be reduced to finding good paths through graphs. Clustering is the task of grouping a set of objects in such a way that objects in the same group are more similar (in some sense) to one another than to those in other groups (clusters). We implement this approach using the NS-2 simulator.
Keywords: Ant Colony Optimization, clustering, energy efficiency, WSN
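The sketch below shows the basic ACO mechanism in a WSN routing setting: ants walk from a sensor node to the sink, choose each next hop with probability proportional to pheromone and inverse link cost, and reinforce the cheapest paths. The 4-node topology and all parameters are illustrative; the clustering variant reviewed in the paper (and the NS-2 implementation) are not modelled here.

```python
# Minimal ACO routing sketch on a tiny WSN graph; all values are illustrative.
import random

# link costs (e.g. transmission energy); the sink is node 3
cost = {(0, 1): 2.0, (0, 2): 5.0, (1, 3): 2.0, (2, 3): 1.0}
pheromone = {e: 1.0 for e in cost}
ALPHA, BETA, RHO = 1.0, 2.0, 0.5        # pheromone weight, cost weight, evaporation

def choose(node):
    out = [e for e in cost if e[0] == node]
    weights = [pheromone[e] ** ALPHA * (1.0 / cost[e]) ** BETA for e in out]
    return random.choices(out, weights=weights)[0]

def run_ant(src=0, sink=3):
    path, node = [], src
    while node != sink:
        edge = choose(node)
        path.append(edge)
        node = edge[1]
    return path, sum(cost[e] for e in path)

for _ in range(50):                     # release 50 ants
    path, length = run_ant()
    for e in pheromone:                 # evaporation on every link
        pheromone[e] *= (1 - RHO)
    for e in path:                      # deposit: cheaper paths get more pheromone
        pheromone[e] += 1.0 / length

print(max(pheromone, key=pheromone.get))  # strongest link, likely (0, 1)
```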
[1]. Y. Liang and R. Wang, A Biologically Inspired Sensor Wakeup Control Method for Wireless Sensor Networks, IEEE Transactions on Systems, Man and Cybernetics, pp. 525-538, 2010.
[2]. L. Yanfei, Q. Xiaojun and Z. Yunhe, An improved design of ZigBee Wireless Sensor Network, 2nd IEEE International Conference on Computer Science and Information Technology, pp. 515-518, 2009.
[3]. H. Modares and A. Moravejosharieh, Overview of Security Issues in Wireless Sensor Networks, IEEE Third International Conference on Computational Intelligence, Modelling and Simulation, pp. 308-311, 2011.
[4]. N. Marriwala and P. Rathee, An approach to increase the wireless sensor network lifetime, IEEE World Congress on Information and Communication Technologies, pp. 495-499, 2012.
[5]. R. Mittal and M.P.S. Bhatia, Wireless sensor networks for monitoring the environmental activities, IEEE International Conference on Computational Intelligence and Computing Research, pp. 1-5, 2010.