A workshop on the Knimbus Mobile Library Application was held on 5.12.2018 in the Reference Section of the Central Library. Mr Anuj Kumar Aggarwal from GIST, Gurgaon, was the resource person for the session. Senior faculty members and library staff attended the workshop.
Readers who wish to register on the Knimbus platform can do so by sending their particulars (name, email ID and contact number) to the mail ID: firstname.lastname@example.org
IMPLEMENTATION OF MAYNARD OPERATION SEQUENCE TECHNIQUE IN AN ENGINE MANUFACTURING INDUSTRY - A CASE STUDY by Bipandeep Singh
Maynard Operation Sequence Technique (MOST) is a predetermined motion time system used primarily in industrial settings to calculate the standard time in which a worker should perform a task. The technique breaks work down into small, discrete elemental activities made up of various parameters and calculates the standard time for each activity from predetermined motion index data cards. MOST is a powerful analytical tool that helps increase productivity, improve methods and facilitate planning. ABC Company produces tractor engines of various power ratings, and market demand for its 54 bhp engine is very high: the company can assemble 70 engines in one shift, but demand for this particular engine exceeds 78 engines. By implementing MOST in the assembly shop, the standard time for the activities was calculated to be 66.31 minutes, whereas the actual existing average time is 71.37 minutes. Pareto analysis of all 128 elemental activities showed that some activities consume more time because of excessive work content. After reducing these activities, the assembly time was further brought down to 62.7 minutes. By achieving this standard time, 9030 man-hours will be saved annually, saving Rs. 9,03,000, and production will increase from 70 to 75 engines per shift.
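As an illustration of the Pareto step, the short sketch below ranks elemental activities by time consumed and flags the "vital few" that account for most of the total; the activity names and times are invented for illustration, not taken from the case study.

    # Illustrative Pareto ranking of elemental activity times (hypothetical data).
    activities = {
        "torque cylinder-head bolts": 4.2,   # minutes per engine (assumed)
        "fit crankshaft": 3.6,
        "mount flywheel": 2.9,
        "fix oil sump": 1.1,
        "attach labels": 0.4,
    }

    total = sum(activities.values())
    cumulative = 0.0
    for name, minutes in sorted(activities.items(), key=lambda kv: kv[1], reverse=True):
        cumulative += minutes
        share = 100.0 * cumulative / total
        tag = "  <- vital few" if share <= 80.0 else ""
        print(f"{name:28s} {minutes:4.1f} min  cumulative {share:5.1f}%{tag}")

Activities flagged here are the candidates for work-content reduction, mirroring how the 128 assembly activities would be screened.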
IMPROVING ENERGY EFFICIENCY OF MANETs USING PROTOCOL DRIVEN BY CPE by Shyna Kalra
Nodes in a mobile ad-hoc network (MANET) move constantly, whereas in a wireless sensor network the nodes are stationary. The constant topology changes confront a MANET with many issues, such as packet loss and link breakage, which must be considered when enhancing network services. Another issue that needs consideration is the network's lifetime, which depends on the remaining energy levels of the nodes; preserving the energy of mobile nodes therefore allows the network to operate for a longer duration. To achieve energy efficiency, it is necessary to improve the transfer of data between the source node and the destination node. The main focus of this study is optimizing the route from source to destination using the pheromone values of nodes during the route-reply phase. The route can be further improved by using a coordinate-based method while broadcasting route-request messages. To optimize network performance, the concept of EENCP has also been taken into account so that node energy is considered while building the path from source to destination. The proposed scheme and the existing schemes were implemented in NS2.35, and throughput, energy consumption and routing overhead were used to analyse network performance. The proposed scheme consumes 11 Joules of energy whereas the existing scheme consumes 12.5 Joules, and this lower energy consumption indicates an enhanced network lifetime. A better network must also have higher throughput, and the proposed scheme delivered more data to the destination node: its throughput of 260 Kbps exceeds the existing scheme's 210 Kbps, indicating better network performance. The routing overhead of the proposed scheme is about 8.5, against approximately 18.5 for the existing scheme, which indicates the establishment of a better route between the source and destination nodes. This is attributed to the use of pheromone values, a concept derived from ant colony optimization, to select the route used to transmit the data. As this study also demonstrates, the ant colony concept optimizes the path better than traditional routing protocols.
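For illustration, the sketch below shows one common form of pheromone-weighted next-hop selection from ant colony optimization, in which candidates with higher pheromone and higher residual energy are proportionally more likely to be chosen; the neighbour list, weighting exponents and energy figures are hypothetical, and this is not the EENCP scheme itself.

    import random

    # Minimal sketch: pick a next hop with probability proportional to
    # pheromone^alpha * residual_energy^beta (roulette-wheel selection).
    def choose_next_hop(neighbours, alpha=1.0, beta=1.0):
        # neighbours: list of (node_id, pheromone, residual_energy_joules)
        weights = [(p ** alpha) * (e ** beta) for _, p, e in neighbours]
        r = random.uniform(0, sum(weights))
        acc = 0.0
        for (node, _, _), w in zip(neighbours, weights):
            acc += w
            if r <= acc:
                return node
        return neighbours[-1][0]

    # Hypothetical candidates discovered during the route-reply phase.
    print(choose_next_hop([("n1", 0.8, 45.0), ("n2", 0.3, 60.0), ("n3", 0.5, 20.0)]))

Weighting by residual energy alongside pheromone is what steers routes away from nearly depleted nodes and extends network lifetime.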
Efficient Extraction of BIM Objects and Other Structural Elements by Mayur Jethwa
Technology has changed the face of the world; with its advent, most manual work has shifted towards autonomous processes. However, when these processes fail to interoperate, the result is often greater inconvenience, so many open standards have come into play to minimize it. The Industry Foundation Classes (IFC) is an open standard created to address interoperability issues, and the popularity of the IFC ecosystem is rising by leaps and bounds. Yet IFC fails to take human error or ignorance into account: relationships that are not created during design are not reflected in an IFC file, and locating such information manually is not a trivial task and is often out of the scope of validators, resulting in safety hazards and compromised convenience. The very building that gives life, service, protection and happiness can take them back; sources say that one of the top reasons for building failure is human error. This error can be minimized if all structures are built to the standards laid down in national structural safety codes. To ensure that all the documented safety codes have been thoroughly verified, the task should be performed by an autonomous machine rather than a manual process. For all these reasons, this thesis focuses on automated extraction of structural entities, their children and their properties, so that the output can be used by structure-validating software to ensure that a structure complies with structural safety codes and to verify that it is maintainable, appealing and convenient.
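To give a flavour of automated element extraction, the sketch below walks an IFC file with the open-source ifcopenshell library and lists a few common structural entity classes; the file path is a placeholder and the class list is illustrative, not the thesis's full extraction pipeline.

    import ifcopenshell  # open-source IFC parser (pip install ifcopenshell)

    # Minimal sketch: open a model and enumerate structural entities.
    # "model.ifc" is a placeholder path, not a file from the thesis.
    model = ifcopenshell.open("model.ifc")

    for ifc_class in ("IfcBeam", "IfcColumn", "IfcSlab", "IfcWall"):
        for element in model.by_type(ifc_class):
            # GlobalId and Name are standard attributes on rooted IFC entities.
            print(ifc_class, element.GlobalId, element.Name)

A validator would then walk each element's property sets and relationships and check them against the rules encoded from the safety codes.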
APPLICATION OF FAILURE MODE CRITICAL ANALYSIS IN CONCRETE MIX BATCHING PLANT USING MCDM METHODS by Lovelesh Seth
The construction industry plays a vital role in the growth and development of a country. Indian construction industries are required to develop a world-class vision for forthcoming world-class projects in India as well as abroad. For production in the construction industry, the concrete mix batching plant is considered one of the most critical pieces of equipment: it proportions and mixes the various concrete mix ingredients, either by mass or by volume, in the mixing unit of the plant so as to obtain the desired quality of concrete mix as per design. Batching plants in the construction industry are heavily loaded in order to meet set targets, so the overloaded machines run a high risk of failure. They therefore require a proper and timely maintenance strategy to avoid sudden breakdowns during operation, which may lead not only to the stoppage of that machine but also to the stoppage of the complete production line that involves it. The research work is based on a case study carried out at one of the most renowned construction companies, Larsen & Toubro Limited, at its project site on the Six Laning of Delhi-Agra Road Project. The main aim of the research work was to allocate objective weights to the maintenance criteria. The Shannon entropy approach is utilized to calculate the criteria weights, and these weights feed the maintenance criticality ranking generated by three multi-criteria decision making (MCDM) approaches (SAW, TOPSIS, VIKOR). The best maintenance strategy is developed considering twelve probable reasons for failure; the most critical failure modes according to the TOPSIS, VIKOR and SAW rankings are loosened blade bolts (D9), mixer shaft facial axial seal (D1), solidification time (D10) and load cell wiring (D12) respectively.
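To make the weighting step concrete, the sketch below computes Shannon entropy weights for a small decision matrix (alternatives in rows, criteria in columns); the matrix values are invented for illustration, not the failure-mode data from the case study.

    import math

    # Minimal sketch of Shannon entropy criteria weighting on a hypothetical
    # 4-alternative x 3-criterion decision matrix.
    matrix = [[7.0, 3.0, 9.0],
              [5.0, 8.0, 2.0],
              [6.0, 6.0, 4.0],
              [8.0, 2.0, 7.0]]
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)                       # normalizes entropy to [0, 1]

    divergence = []
    for j in range(n):
        col_sum = sum(row[j] for row in matrix)
        p = [row[j] / col_sum for row in matrix]            # column-normalized shares
        entropy = -k * sum(pij * math.log(pij) for pij in p if pij > 0)
        divergence.append(1.0 - entropy)                    # degree of divergence d_j
    total_d = sum(divergence)
    weights = [d / total_d for d in divergence]             # objective weights w_j
    print([round(w, 3) for w in weights])

Criteria whose values differ strongly across alternatives carry more information (lower entropy) and thus receive higher objective weights, which then feed SAW, TOPSIS and VIKOR.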
STABILIZATION USING PAPER MILL SLUDGE ASH AND SAW DUST ASH by Jasvir Singh
Expansive soils are among the most problematic soils and are found in every part of the earth except the polar zones. One of their main problems is their shrinking and swelling behaviour on contact with water. When structures are constructed on these soils, they cause differential settlement, which may place an economic burden on developers, and structural stability is endangered if proper remedial measures are not taken. Many methods and techniques are used to prevent the harm caused by these soils; one of the best, in use since ancient times, is soil stabilization. In ancient times, additives were added to soils to stabilize them. Nowadays many other additives, such as polymers, waste materials and salts, are used to stabilize soil in combination with basic additives like cement, lime, fly ash and bitumen, and proper gradations of these additives are added to treat the soil effectively. At present, waste management is a growing problem for industries, and many researchers are investigating the use of waste materials for soil stabilization, which can solve both problems at once. This research rests on understanding the reaction between lime-rich wastes and soil. This study was therefore carried out to examine the effect of lime-rich paper mill sludge ash and saw dust ash on the properties of expansive soils. The main properties investigated were the California bearing ratio (CBR) and strength. The unconfined compressive strength (UCS) test was performed to inspect the strength of the soil, and the stress-strain curve from this test was drawn to check whether the material becomes brittle or ductile with the addition of the waste materials. The CBR test was performed to check the suitability of the soil as subgrade in flexible pavement. Apart from these, the other engineering properties examined were the moisture content and dry density of the soil, needed for the UCS and CBR tests, as well as the Atterberg limits and grain-size analysis of the soil samples. The materials used in the study, along with the soil samples, were paper mill sludge ash, which results from burning the sludge produced in paper mills, and saw dust ash, produced by burning saw dust. The study noted that soil treated with the two ashes behaves comparably to soil treated with lime and cement.
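For reference, the CBR reported by the test is the ratio of the measured load to a standard load at a given penetration. A minimal sketch follows, using the standard loads of IS 2720 (Part 16), namely 1370 kgf at 2.5 mm and 2055 kgf at 5.0 mm penetration, with hypothetical measured loads rather than the thesis data.

    # Standard loads per IS 2720 (Part 16).
    STANDARD_LOAD_KGF = {2.5: 1370.0, 5.0: 2055.0}

    def cbr_percent(measured_load_kgf, penetration_mm):
        return 100.0 * measured_load_kgf / STANDARD_LOAD_KGF[penetration_mm]

    # Hypothetical measured loads; practice takes the 2.5 mm value unless the
    # 5.0 mm value remains higher on a repeat test.
    print(f"CBR(2.5 mm) = {cbr_percent(96.0, 2.5):.1f}%")
    print(f"CBR(5.0 mm) = {cbr_percent(132.0, 5.0):.1f}%")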
OPTIMIZATION OF RCC COLUMN SUBJECTED TO AXIAL LOAD AND UNIAXIAL MOMENT by Ansh Khurana
In modern structural development, economy plays a vital role in the industry, so for maximum benefit to builders and clients, an economical structure that is safe, serviceable and durable is needed. The cost of the structure can be minimized without compromising the material only by finding, for each structural member, the configuration that gives the minimum cost while keeping strength and other parameters satisfied. Columns are important vertical structural elements constructed integrally with framing beams and slabs to carry axial forces and bending moments. A number of optimization techniques are used by researchers, each with its own advantages and disadvantages. In the present work, one of the most recently developed techniques, 'ray optimization', has been used for the optimization of RC columns. The technique is based on the phenomenon of refraction of light, and since it rests on a physical phenomenon, it is easy to understand and use. Convergence of the process, though, depends on factors such as the size of the search space, the refractive index and the number of local minima. In this research, RC columns subjected to axial loading and a uniaxial moment have been optimized. The column design depends on many factors, as indicated in the interaction diagrams: eccentricity of loading, size of the column cross-section, percentage of steel, position of the neutral axis, grade of steel and grade of concrete. A MATLAB program has therefore been developed for column design using analytical formulae that do not involve the use of design charts. A program implementing the ray optimization algorithm has also been written in the MATLAB editor and saved as functions, and the two programs have been coupled to work as an optimization tool for column design. Two variables, the depth of the neutral axis and the percentage of steel in the column, are taken as the independent variables of the optimization problem, while the grade of concrete, grade of steel, length and loading are taken as inputs. The algorithm was tested on certain standard mathematical functions to confirm its correctness, and the results obtained were in agreement with the standard results. A number of columns for different loadings were designed to validate the effectiveness of the ray optimization technique, and the optimization process was run multiple times to check the robustness of the algorithm. The study observed that the most optimal sections have the minimum cross-sectional width and the minimum percentage of steel, i.e. 0.8%. The effect of different parameters, such as grade of steel, grade of concrete, number of design agents and variation in refractive index values, on the optimum results was also studied. It was observed that increasing the grade of concrete or steel reduces the column section and thus gives more economical designs; increasing the number of agents yields the optimum results in fewer iterations; and for a refractive index of 0.5-0.8 the results are most optimal.
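To give a feel for the problem set-up, the sketch below minimizes a hypothetical cost function for an axially loaded short RC column under the IS 456 capacity check Pu <= 0.4*fck*Ac + 0.67*fy*Asc, using plain random search as a stand-in for ray optimization; the load, grades and unit rates are assumed, and the thesis's actual MATLAB formulation also handles the uniaxial-moment interaction.

    import random

    FCK, FY = 25.0, 415.0                      # N/mm^2, assumed concrete and steel grades
    PU = 1500e3                                # N, factored axial load (hypothetical)
    CONCRETE_RATE, STEEL_RATE = 5e-6, 400e-6   # assumed cost per mm^3 of section

    def capacity(b, d, p):
        asc = b * d * p / 100.0                # steel area from percentage p
        ac = b * d - asc                       # net concrete area
        return 0.4 * FCK * ac + 0.67 * FY * asc

    def cost(b, d, p):
        asc = b * d * p / 100.0
        return CONCRETE_RATE * (b * d - asc) + STEEL_RATE * asc

    best = None
    for _ in range(20000):
        b = random.uniform(230, 600)           # mm, width
        d = random.uniform(230, 900)           # mm, depth
        p = random.uniform(0.8, 4.0)           # %, IS 456 longitudinal steel limits
        if capacity(b, d, p) >= PU:            # keep only feasible sections
            c = cost(b, d, p)
            if best is None or c < best[0]:
                best = (c, b, d, p)
    print(best)

Consistent with the thesis's observation, feasible minimum-cost sections tend to sit at the smallest width and the minimum steel percentage that still satisfy the capacity constraint.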
POSSIBILITY OF RTW & CDW IN STONE COLUMN TO IMPROVE BEARING CAPACITY OF CLAYEY SOIL by Gagandeep Singh
The foundation is the main part of any structure in civil engineering; it rests on the soil, so ultimately all the load of the structure transfers to the ground. The soil under the foundation should have a safe bearing capacity so that it does not fail. Since the superstructure depends mainly on its foundations, the whole structure must rest on soil of suitable bearing capacity, and the value of that bearing capacity decides the amount of improvement required. The improvement may be achieved using piles, piers, caissons or stone columns. The material used in stone columns is aggregate of up to 100 mm size, and waste materials such as rubber and concrete demolition waste can be used as a replacement for the aggregate. In the present study, clay of medium plasticity (CI) was collected from village Lohatbaddi, district Ludhiana (Punjab). Concrete demolition waste (CDW) was collected from cubes tested in the concrete laboratory, and rubber tyre waste (RTW) in crumb powder form was collected from the Speedways tyre industry, Transport Nagar, Ludhiana. An attempt was made to use CDW and RTW to improve the bearing capacity of the soil, with RTW:CDW percentages of 0:100, 20:80, 40:60, 60:40, 80:20 and 100:0. The optimized RTW:CDW ratio for a single column is 20:80 and the optimized L/D ratio of the column is 6. This percentage was then used for L/D ratios of 3, 6 and 10 with 1, 2, 3, 4 and 5 columns. The allowable bearing capacity for an L/D ratio of 6 is greater than for L/D ratios of 3 and 10, and it was maximum for five columns. The allowable bearing capacity with five stone columns was 2-3 times the bearing capacity of the soil without stone columns.
ASSESSING SEMANTIC INFORMATION OF VOLUNTEERED GEOGRAPHIC INFORMATION by Gursimar Kaur
The world of cartography and map making has changed dramatically with the advent of new technological innovations and the emergence of hand-held mobile devices. Features like web mapping and navigation using electronic maps have made paper maps outdated and led to a new phenomenon, Volunteered Geographic Information (VGI), in which volunteers (private citizens) collaborate to share geographical information. Inspired by Wikipedia, OpenStreetMap (OSM) is the most successful VGI project used for web mapping. It allows its contributors the freedom of global participation, contributing their local knowledge for open access to everyone. Because of the open tagging scheme, contributors introduce noisy and ambiguous data, as users are free either to reuse a previously generated tag or to define their own; furthermore, no strict specification model is used to audit the quality of the contributed data. The aim of the study is to assess the semantic similarity of the tags used to name geographical features with the help of various string-matching algorithms. The study implemented the algorithms to measure the semantic similarity score of the data under observation by assessing the attributes of the tags, and then classified the results as acceptably similar or not, depending on a desired threshold value. The positional accuracy of linear features depicting real-world geographical representations was assessed by creating a constant-width buffer around a line, formed as a circle of fixed radius (also called the epsilon band) is rolled along both sides of the line. The designed approach helped to assess data completeness and to analyse the level of correlation in the given attribute constraints. Since the features are compared with a dataset of higher accuracy, the evaluation is not limited to OSM and can be generalized to any other database crowdsourced by volunteers. The developed algorithms contribute to enhancing the enormous potential of this ever-richer dataset by improving its quality and alleviating the semantic gap in geospatial information.
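As a small illustration of the similarity step, the sketch below scores tag pairs with Python's standard-library SequenceMatcher and applies a threshold; the tag pairs and the 0.8 threshold are hypothetical, and the thesis's own string-matching algorithms may differ.

    from difflib import SequenceMatcher

    # Minimal sketch: normalize two tags, score their similarity in [0, 1],
    # and accept the pair if the score clears a chosen threshold.
    def acceptably_similar(tag_a, tag_b, threshold=0.8):
        a = tag_a.lower().replace("_", " ")
        b = tag_b.lower().replace("_", " ")
        score = SequenceMatcher(None, a, b).ratio()
        return round(score, 2), score >= threshold

    for pair in [("primary_school", "Primary School"), ("fuel", "petrol_station")]:
        print(pair, acceptably_similar(*pair))

Normalizing case and underscores before scoring is what lets freely chosen tags like "primary_school" and "Primary School" be recognized as the same concept.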
AN OPTIMIZED LUNG CANCER CLASSIFICATION SYSTEM FOR COMPUTED TOMOGRAPHY IMAGES by Sheenam Rattan
Among the diverse cancers, lung cancer is considered the foremost cause of cancer death, with the highest mortality rate. Lung nodules have distinct structures; they can be circular or coil-shaped, which in various circumstances makes their recognition complex. In this work a system has been developed for detecting lung cancer in its early stages and classifying tumors as malignant or benign using images from a Computed Tomography (CT) scanner. The lung cancer detection process has four steps: a pre-processing phase, segmentation, feature extraction and lung cancer cell classification. The BAT algorithm is applied to provide considerable optimization, which improves the performance of the system, and the classification between malignant and benign nodules is done with an Artificial Neural Network ensemble to give results of higher accuracy. The system achieves overall accuracy, sensitivity and specificity of 98.5%, 100% and 91% respectively.
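To illustrate the optimizer, the sketch below runs a simplified bat algorithm (fixed pulse rate, decaying loudness) on a toy one-dimensional objective; the imaging pipeline itself is omitted, and all constants are illustrative rather than the thesis's settings.

    import random

    def objective(x):
        return (x - 3.0) ** 2          # toy objective with its minimum at x = 3

    N, F_MIN, F_MAX, ALPHA = 20, 0.0, 2.0, 0.97
    bats = [{"x": random.uniform(-10, 10), "v": 0.0, "A": 1.0, "r": 0.5}
            for _ in range(N)]
    best = min((b["x"] for b in bats), key=objective)

    for _ in range(200):
        for b in bats:
            freq = F_MIN + (F_MAX - F_MIN) * random.random()
            b["v"] += (b["x"] - best) * freq       # pull velocity towards the best bat
            candidate = b["x"] + b["v"]
            if random.random() > b["r"]:           # occasional local walk near the best
                candidate = best + 0.01 * random.gauss(0, 1)
            if random.random() < b["A"] and objective(candidate) < objective(b["x"]):
                b["x"], b["A"] = candidate, b["A"] * ALPHA   # accept; reduce loudness
            if objective(b["x"]) < objective(best):
                best = b["x"]
    print(round(best, 4))

In the classification system, an objective such as validation error would replace the toy function, with the bats searching over feature or network parameters.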
PERFORMANCE EVALUATION AND ANALYSIS OF CONVOLUTION CODING ON SDR PLATFORM by Gaganpreet Kaur
All devices and computers work more efficiently than before owing to the rapid advance of technology. In the past, practical research was very costly in terms of money and time because prototype circuit boards were used to test any candidate design. In the present scenario, computers are powerful enough to perform digital signal processing tasks that were once done by dedicated devices; even the affordable personal computers used at home can carry out the necessary computation in the same manner as dedicated hardware. Software Defined Radio (SDR) is such a device: translating the signal processing into software run by a regular computer opens up a huge number of possibilities at an affordable price, so every value in a wireless communication system can easily be examined and modified. The demand for wireless communication systems is increasing day by day; with more and more users to accommodate, the transmission of data over a crowded channel results in loss of data and errors. There is therefore a need for a platform flexible enough to accommodate a large number of users with a low probability of error. Channel coding is one of the most important techniques here: it represents the source bits for transmission in such a way that the error probability over a crowded, noisy channel is minimized through the systematic addition of redundant bits, improving the error-rate performance of the communication system and resulting in better reception. With the increasing demand for efficient standards, SDR provides flexibility and low-cost solutions for today's wireless communication needs, since all the signal processing is done at the software level. In this work, a forward error correction code, namely convolutional coding, has been implemented on SDR because of its ability to improve the BER and SER performance of conventional modulation techniques such as PSK and QAM. Audio encoders, used for speech compression in radios and other sound systems, can also be enhanced using convolutional coding. This will lead to the development of new standards that can revolutionize the field of wireless communication.
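As a concrete example of the encoding stage, the sketch below implements the classic rate-1/2 convolutional encoder with constraint length 3 and generator polynomials (7, 5) in octal; this is a textbook configuration, not necessarily the code rate or generators used in the thesis, and an SDR chain would follow it with PSK/QAM modulation and Viterbi decoding at the receiver.

    # Generator polynomials 7 and 5 (octal) over a 3-bit shift register.
    G1, G2 = 0b111, 0b101

    def conv_encode(bits):
        state = 0
        out = []
        for b in bits:
            state = ((state << 1) | b) & 0b111          # shift the new bit into the register
            out.append(bin(state & G1).count("1") % 2)  # parity bit from generator 1
            out.append(bin(state & G2).count("1") % 2)  # parity bit from generator 2
        return out

    print(conv_encode([1, 0, 1, 1, 0, 0]))              # two coded bits per message bit

The redundancy (two output bits per input bit) is exactly what the decoder exploits to correct channel errors and lower the BER and SER.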
DESIGN AND DEVELOPMENT OF TOOL FOR ASSESSING OSM COMPLETENESS by Sonali Arora
OpenStreetMap (OSM) is a collaborative effort to create a free map of the world that can be accessed by anyone. OSM is one of the most prevalent instances of Volunteered Geographic Information (VGI) and has become one of the great alternative sources of geodata in recent years. Since OSM generates a large amount of spatial data contributed by users with different levels of mapping experience and different backgrounds, the quality of OSM can vary strongly. Several studies have investigated different aspects of this problem; in most of them, ground-truth reference datasets have been used for comparison, an approach called extrinsic analysis. But extrinsic analysis is not always possible because ground-truth reference datasets may be unavailable; hence, intrinsic analysis can serve as a prominent basis for making approximate statements about the quality of OSM. This investigation analyses the existing intrinsic frameworks and their limitations and then proposes six new quality parameters for effectively assessing the completeness of OSM data. A framework has been developed on the basis of the proposed parameters. The results support statistical analysis and interpretation through visualizations in the form of bar charts, graphs, tables and maps, allowing the quality of the data to be assessed without the help of any ground-truth reference datasets. This enables OSM completeness assessment for any arbitrary part of the world.
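For a taste of intrinsic assessment, the sketch below computes one simple indicator, feature density per grid cell, from feature coordinates alone and flags unusually sparse cells; the points, cell size and flagging rule are hypothetical, and the thesis's six parameters go well beyond this single measure.

    from collections import Counter

    CELL = 0.01   # grid cell size in degrees (assumed)

    def cell_of(lon, lat):
        return (int(lon // CELL), int(lat // CELL))

    # Hypothetical (lon, lat) feature coordinates, e.g. extracted from OSM.
    features = [(76.360, 30.352), (76.361, 30.353), (76.372, 30.349), (76.361, 30.352)]
    density = Counter(cell_of(lon, lat) for lon, lat in features)

    mean = sum(density.values()) / len(density)
    for cell, count in density.items():
        status = "possibly incomplete" if count < 0.5 * mean else "ok"
        print(cell, count, status)

No reference dataset is consulted: cells are judged only relative to their neighbours, which is what makes the indicator intrinsic.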