We study the problem of online multi-task learning for solving multiple related classification tasks in parallel, aiming to classify each sequence of data received by each task accurately and efficiently. One practical example of online multi-task learning is micro-blog sentiment detection over a group of users, which classifies the micro-blog posts generated by each user into emotional or non-emotional categories. First, to meet the critical requirements of online applications, a highly efficient and scalable classification solution that can make immediate predictions at low learning cost is required. This requirement rules out conventional batch learning algorithms. Second, classical classification methods, whether batch or online, often face a dilemma when applied to a group of tasks: on one hand, a single classification model trained on the entire collection of data from all tasks may fail to capture the characteristics of individual tasks; on the other hand, a model trained independently on each task may suffer from insufficient training data. To overcome these challenges, in this paper we propose a collaborative online multi-task learning method that learns a global model over the entire data of all tasks. We evaluate it on three real-world problems: spam email filtering, bioinformatics data classification, and micro-blog sentiment detection. Experimental results show that our method is effective and scalable for the online classification of multiple related tasks.
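To make the global-versus-individual dilemma concrete, the following is a minimal sketch of one plausible instantiation of collaborative online multi-task learning, not the paper's actual algorithm. It is a mistake-driven perceptron in which each task predicts with the sum of a shared global weight vector and its own task-specific vector, so tasks with little data still benefit from updates made by related tasks; the class name, learning-rate parameter, and update rule are all illustrative assumptions.

```python
import numpy as np

class MultiTaskOnlinePerceptron:
    """Hypothetical sketch: task t predicts with sign((w_global + w_task[t]) . x).

    A mistake on any task updates both the shared and the task-specific
    weights, so information flows across related tasks through w_global.
    """

    def __init__(self, n_tasks, n_features, lr=1.0):
        self.w_global = np.zeros(n_features)          # shared across all tasks
        self.w_task = np.zeros((n_tasks, n_features)) # one vector per task
        self.lr = lr

    def predict(self, task, x):
        score = (self.w_global + self.w_task[task]) @ x
        return 1 if score >= 0 else -1

    def update(self, task, x, y):
        # Perceptron-style update: only on a mistake, and on both the
        # shared and the local components.
        if self.predict(task, x) != y:
            self.w_global += self.lr * y * x
            self.w_task[task] += self.lr * y * x
```

In an online setting, the learner consumes a stream of (task, features, label) triples, predicting before each update, which matches the abstract's requirement of immediate predictions at low learning cost.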