In this class I have learned about leadership and about technology in education. Not working as an educator gives me a different perspective on how technology can be applied, so it was interesting to hear the opinions of teachers and the different problems they deal with regarding technology in their classrooms. It was also rewarding to share opinions with fellow students; we all have different backgrounds in education and technology, and these discussions allowed us to learn from each other and, most importantly, to see things from another perspective. I think this class has provided me with the tools I need to prepare myself to be a leader in education. From the podcasts and examples provided by Dr. Newberry to the discussions with my classmates, everything offered information I can draw on when I am faced with a decision about using technology to benefit learners in educational settings.
Overall this was a great class. I enjoyed the different projects because not only did I acquire new technology skills, but I was also engaged in projects that will serve me in my job and in my future as a professional in instructional technology. Taking this class online was great, and it promoted more communication with classmates than a classroom would. In a classroom we do not always engage in conversations with every student, but using the blogs allowed us to communicate with everybody at least once.
Activity Log:
Listened to podcast
Posted response
Responded to other posts
Tuesday, March 13, 2007
Thursday, March 8, 2007
Session 9
At Cal State I have attended many excellent professional development sessions. I used to work for a technology grant, and as part of our activities we provided professional development workshops for our College of Education faculty. Many of these sessions were great examples of effective professional development. The presenters were experts in the topics we covered and allowed for hands-on experience in every workshop. The pace of each presentation let them answer questions from the audience without exceeding the time allotted, and every workshop concluded with a Q&A section to make sure the attendees understood the concepts and could apply them to their courses. These workshops were excellent examples because of the experience of the presenters, who were mostly professional presenters from companies working with the grant, CSUSB faculty, and professional development staff from the department of academic computing and media. The topics were always tailored to integrating technology in the classroom. The workshops were also short, no more than two hours, and each one distributed and collected evaluation forms. Evaluations are important in professional development because they can be used to improve future presentations.
Recently I attended a session that could be considered a poor example, mainly due to the inexperience of the presenter. He is very knowledgeable about the topic but lacks experience managing the many parts of a presentation. For example, his presentation was constantly stopped to answer questions from the audience; it is important to answer questions, but constant interruptions are very disruptive for the rest of the audience. The training also covered three to four technology tools in an hour and a half, which made the pace very fast and confused the audience. Finally, the presenter did not include an evaluation, and without one it is very difficult to improve the training.
Project 3
Activity Log
Listened to podcast and posted my response
Posted project 3
Sunday, March 4, 2007
Session 8
Data Driven Decision Making
The program I work for here at CSUSB implements data driven decision making constantly. Our funding is generated by the number of student teachers working as interns in school districts across the Inland Empire, so it is important for us to maintain constant communication with our students in order to satisfy their needs. Last year the CCTC distributed a survey to be conducted by all intern programs, requesting information from interns across the state. The survey collected the interns' opinions about the program overall, including their experience in the classroom as students as well as teachers in their districts. The information collected would help the CCTC provide supporting evidence for legislation that would provide more funding to intern programs across the state. It assisted not only our program but also the College of Education in understanding students' feelings about courses, faculty, staff, and specific program requirements.
A positive use of data driven decision making in this example is that it can potentially increase funding for intern programs in institutions across the state. This information can also help intern programs improve students' experience by making the appropriate changes to address their concerns and needs.
A negative aspect of data driven decision making can be the collection process itself. At our university we conduct many surveys to gather this information. Students are constantly bombarded with surveys and evaluations used for data driven decision making, which can generate a negative response because it takes time away from their responsibilities. When we conducted our survey, for example, the College of Education had already sent two other surveys to exiting credential students, which resulted in a low response rate.
Leadership is important in these situations. In our case the CCTC demanded a response rate above 80% from all programs. Our program achieved this by holding a raffle with three prize drawings on established deadlines, so that by the final drawing most of the surveys had been collected.
Proposal and presentation for Ed-Media 2007.
Project 2
Project 2 Powerpoint
Weekly tasks
Listened to podcast and posted response.
Posted comments on other blogs
Finalized and posted Project 2
Continued work on Project 3